AI DevWorld
Tuesday, October 25, 2022
PRO Workshop (AI): Product Led Growth: A new paradigm shift in Data Science and Product Manager Collaboration
Data Science in industry requires close collaboration with Qual Researchers, Engineers, and Product Managers to drive metrics within the product and build personalized in-app experiences. In recent times, Product Led Growth (PLG) initiatives have resulted in a positive shift in the working paradigm between Product Managers and Data Scientists. In this talk, I will begin with PLG, what it means, and the impact it has had on almost all big tech products and services. I will share a few algorithms and operating models for successful PLG motions in large tech companies. I will also go over how modern user segmentation requires both data skills and subject matter expertise, and how it gets deployed for personalization use cases.
PRO Workshop (AI): Sparsity without Sacrifice – How to Accelerate AI Models Without Losing Accuracy
Most companies with AI models in production today are grappling with stringent latency requirements and escalating energy costs. One way to reduce these burdens is by pruning such models to create sparse, lightweight networks. Pruning involves the iterative removal of weights from a pre-trained dense network to obtain a network with fewer parameters, trading off against model accuracy. Determining which weights should be removed in order to minimize the impact on the network's accuracy is critical. For real-world networks with millions of parameters, however, analytical determination is often computationally infeasible; heuristic techniques are a compelling alternative.
In this presentation, we talk about how to implement commonly used heuristics such as gradual magnitude pruning (GMP) in production, along with their associated accuracy-speed trade-offs, using the BERT family of language models as an example.
Next, we cover ways of accelerating such lightweight networks to achieve peak computational efficiency and reduce energy consumption. We walk through how our acceleration algorithms optimize hardware efficiency, unlocking order-of-magnitude speedups and energy savings.
Finally, we present best practices on how these techniques can be combined to achieve multiplicative effects in reducing energy consumption costs and runtime latencies without sacrificing model accuracy.
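As a rough illustration of the magnitude-pruning idea described above, the sketch below uses PyTorch's built-in pruning utilities to iteratively remove the smallest-magnitude weights from a toy model; the model, schedule, and sparsity settings are illustrative assumptions, not the presenters' production recipe.

```python
# Minimal sketch of gradual magnitude pruning (GMP) with PyTorch's pruning utilities.
# Model, schedule, and sparsity settings are illustrative assumptions only.
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 2))
params_to_prune = [(m, "weight") for m in model if isinstance(m, nn.Linear)]

# Iteratively raise sparsity: each step removes roughly 30% of the remaining weights,
# with fine-tuning between steps so the network can recover accuracy.
for step in range(4):
    prune.global_unstructured(
        params_to_prune,
        pruning_method=prune.L1Unstructured,
        amount=0.3,
    )
    # ... fine-tune the model here before the next pruning step ...

# Make the pruning permanent by removing the re-parametrization masks.
for module, name in params_to_prune:
    prune.remove(module, name)

zeros = sum((m.weight == 0).sum().item() for m, _ in params_to_prune)
total = sum(m.weight.nelement() for m, _ in params_to_prune)
print(f"final sparsity: {zeros / total:.1%}")
```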
PRO Workshop (AI): Scaling ML Embedding Models to Serve a Billion Queries
This talk aims to provide deeper insight into the scale, challenges, and solutions formulated for powering embedding-based visual search at eBay. It walks the audience through the model architecture, the application architecture for serving users, the workflow pipelines built to produce the embeddings used by Cassini, eBay's search engine, and the unique challenges faced along this journey. The talk provides key insights specific to embedding handling and how to scale systems to provide real-time clustering-based solutions for users.
PRO Workshop (AI): Artificial General Intelligence with GPT-3 with Open AI
Large Language Models (LLMs) have come out of the realm of academia and research and become available to average development teams, thanks to the efforts of OpenAI and its competitors. Now that we have access to them, what can we do with them?
This talk will explore some of the practical uses for GPT-3 made available through OpenAI. We will start with a brief introduction to LLMs and transformers and how they bring us a step closer to artificial general intelligence. We will focus on real demonstrations. Each capability will start with a canned demonstration and move on to ad hoc input provided by the audience.
• Text Generation
○ Turn complex text into a simple summary
○ Create an outline of an essay
• Conversation
○ Sarcastic chat bot
• Code Generation
○ Explain Python Code
○ Translate text into programmatic commands
• Question Answering
○ Factual Answering
You will leave this talk with an understanding of Large Language Models and their practical use cases. Walk away inspired on how to apply large language models to your business today!
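To give a concrete flavor of the canned demonstrations described above, here is a minimal sketch of a GPT-3 text-summarization call using the 2022-era OpenAI Python SDK; the model name, prompt, and parameters are illustrative assumptions rather than the presenter's exact demo.

```python
# Minimal sketch of a GPT-3 text-generation call via the OpenAI Python SDK (0.x era).
# Model name, prompt, and parameters are illustrative assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

complex_text = "Mitochondria are membrane-bound organelles that generate most of a cell's chemical energy..."
response = openai.Completion.create(
    model="text-davinci-002",
    prompt=f"Summarize the following text for a second grader:\n\n{complex_text}\n\nSummary:",
    max_tokens=100,
    temperature=0.3,
)
print(response["choices"][0]["text"].strip())
```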
Wednesday, October 26, 2022
PRO TALK (AI): ML Drift Monitoring: What to Observe, How to Analyze & When to Act
Deploying a new ML model in production successfully is a great achievement, but it is also the beginning of a persistent challenge to keep it performing at expected levels. Models in production will drift and decay, and the value they provide to the business will drop. ML drift monitoring is a challenging task: identifying the right data to collect, the right metrics to compute, the right trends to analyze, and the right actions to take. This session will explore the process of model drift monitoring, from model instrumentation to determining the next best action. Real-life challenges will be explored, and best practices and recommendations will be discussed.
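To make the "right metrics to compute" step concrete, here is a minimal sketch that compares a feature's training-time distribution against a recent production window using a two-sample Kolmogorov-Smirnov test and a simple population stability index; the data and thresholds are illustrative assumptions, not the speaker's method.

```python
# Minimal drift-check sketch: compare a training-time feature distribution
# against recent production values. Data and thresholds are illustrative assumptions.
import numpy as np
from scipy.stats import ks_2samp

def population_stability_index(expected, actual, bins=10):
    """PSI between two samples of one feature, using quantile bins from `expected`."""
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    actual = np.clip(actual, edges[0], edges[-1])
    e_frac = np.histogram(expected, edges)[0] / len(expected) + 1e-6
    a_frac = np.histogram(actual, edges)[0] / len(actual) + 1e-6
    return float(np.sum((a_frac - e_frac) * np.log(a_frac / e_frac)))

rng = np.random.default_rng(0)
train_feature = rng.normal(0.0, 1.0, 10_000)   # distribution seen at training time
live_feature = rng.normal(0.4, 1.2, 2_000)     # recent production window (drifted)

ks_stat, p_value = ks_2samp(train_feature, live_feature)
psi = population_stability_index(train_feature, live_feature)

# Common rule of thumb: PSI > 0.2 or a very small KS p-value warrants investigation.
if psi > 0.2 or p_value < 0.01:
    print(f"Drift suspected: KS={ks_stat:.3f} (p={p_value:.3g}), PSI={psi:.3f}")
```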
OPEN TALK (AI): Lessons Learned Building Natural Language Systems in Healthcare
This session reviews case studies from real-world projects that built AI systems that use Natural Language Processing (NLP) in healthcare. These case studies cover projects that deployed automated patient risk prediction, automated diagnosis, clinical guidelines, and revenue cycle optimization.
We will cover why and how NLP was used, what deep learning models and libraries were used, how transfer learning enables tuning accurate models from small datasets, and what was productized and achieved. Key takeaways for attendees will include applicable best practices for NLP projects, including how to build domain-specific healthcare models and how to use NLP as part of larger machine learning and deep learning pipelines.
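The point about transfer learning from small datasets can be illustrated with a generic sketch that is not the session's actual stack: reuse a pre-trained sentence encoder as a frozen feature extractor and fit a small classifier on a handful of labeled clinical notes. The model name, texts, and labels below are illustrative assumptions.

```python
# Minimal transfer-learning sketch: a pre-trained sentence encoder as a fixed feature
# extractor plus a small classifier trained on a tiny labeled clinical dataset.
# Model name, texts, and labels are illustrative assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("all-MiniLM-L6-v2")   # general-purpose pre-trained encoder

texts = [
    "patient denies chest pain or shortness of breath",
    "acute dyspnea with elevated troponin, admit to cardiology",
    "routine follow-up, vitals stable",
    "sepsis suspected, start broad-spectrum antibiotics",
]
labels = [0, 1, 0, 1]   # toy labels: 0 = low risk, 1 = high risk

X = encoder.encode(texts)                     # frozen embeddings from the pre-trained model
clf = LogisticRegression(max_iter=1000).fit(X, labels)

print(clf.predict(encoder.encode(["worsening shortness of breath overnight"])))
```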
KEYNOTE (AI): LivePerson -- Building a Mental Model Around Conversational AI: Why We Need to Teach How to Interact with Bots
Use of conversational AI across retail, finance, healthcare, and other industries is on the rise. Whether they recognize it or not, today's consumers are rapidly shifting their mindset — they are ready for, and even demand, a new type of interaction with brands centered around messaging: Indeed, new research shows that over three-quarters (78%) of consumers want the ability to message with businesses and 83% would browse or buy products in messaging conversations.
Perhaps most importantly, consumers are suddenly, radically more open to automated conversations now than ever before: Positive sentiment towards chatbots nearly doubled in 2021 (61%) vs in 2020 (31%).
Despite new capabilities that make chatting with a conversational AI bot more like having a conversation with a human, there isn't yet a prevailing mental model for what conversational AI is that will help people get the most out of their interactions with it. Simply put, people aren't sure how to talk to bots. On the one hand, some people treat it like a search engine, typing in short commands; on the other, some treat it like another human, telling long-winded stories and burying the question or issue they are really trying to address.
Similar to when search engines were first invented and people had to figure out how to effectively use them, many people may not know how to maximize the efficiency of a bot conversation. Tech companies can and must take the lead on that instruction to enable correct use of their products and to help users get the most benefit out of them.
During this session, Joe Bradley will offer guidance on how companies can help users find the middle ground between these two scenarios, and how they can begin creating a playbook for cultivating best practices for interacting with conversational AI.
There are many questions around how companies should teach people to interact with conversational AI and how they can make this form of communication most successful that are just now being explored – How can we be sensitive to the fact that different people will respond to conversational AI in different ways? How can we help people learn and get the most out of this new type of interaction? Not only do these questions intersect with machine learning but they also involve psychology and sociology.
While few people have the time (or interest) to dive deep into how best to interact with conversational AI, bot builders can begin to offer clues and guidance on how to engage with conversational AI bots effectively. Having previously worked on data science and e-commerce projects at Amazon and Nike, and having advised brands like David's Bridal and Virgin Atlantic at LivePerson on how to build their bot strategies, Joe Bradley will share his learnings on how to build a mental model around conversational AI that gets the most out of this increasingly used form of interaction.
OPEN TALK (AI): Deep Dive on Creating a Photorealistic Talking Avatar
Creating a photorealistic avatar that speaks any sentence, starting from written input text.
Focusing on autoencoders, we will take a journey from the beginning (of the speaker's experience), covering the mistakes made and tips learned along the path.
The following will be showcased:
- Intro, the timeline from beginning to nowadays
- Why this is NOT a deepfake
- Audio processing techniques: STFT (Short-Time Fourier Transform), mel spectrograms, and custom solutions (a short code sketch follows this list)
- Deep learning models and architectures
- The technique, inspired by inpainting, used to animate the mouth
- Masks and convolution
- Landmarks extraction
- A morphing animation technique based on autoencoder features
- Microsoft Azure Speech services used to support audio and animation processing
- Putting it all together
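As referenced in the list above, here is a minimal sketch of the STFT and mel-spectrogram features using librosa; the file path and parameters are illustrative assumptions, not the speaker's pipeline.

```python
# Minimal sketch of the audio features mentioned above: an STFT and a mel spectrogram
# computed with librosa. File path and parameters are illustrative assumptions.
import librosa
import numpy as np

y, sr = librosa.load("speech_sample.wav", sr=16000)   # hypothetical input clip

# Short-Time Fourier Transform: complex spectrogram of shape (1 + n_fft/2, frames)
stft = librosa.stft(y, n_fft=1024, hop_length=256)
magnitude_db = librosa.amplitude_to_db(np.abs(stft), ref=np.max)

# Mel spectrogram: perceptually spaced frequency bins, a common input for talking-head models
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=1024, hop_length=256, n_mels=80)
mel_db = librosa.power_to_db(mel, ref=np.max)

print(magnitude_db.shape, mel_db.shape)
```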
PRO TALK (AI): Data Ecosystem a Stepping Stone for Decarbonization of Operation Industry
Climate change is possibly one of the most complex and challenging issues on earth, and manufacturing companies often find themselves in the crosswinds of it. The oil and gas, mining, chemical, cement, energy, and utility sectors are responsible for more than 50% of industrial GHG emissions. The changes they are bringing into their operations are not enough to address the issue, and new initiatives for carbon abatement are not showing any visible improvement in reducing GHG levels in the environment.
In this session, we will analyze how data ecosystems such as LiDAR and remote-sensing data, together with the IT and OT data pertinent to these manufacturing companies, can help them track/measure, trace, and mitigate excess emissions from their operations. We will also explore how advanced AI techniques such as deep learning and reinforcement learning can be used effectively to find optimal solutions for the above-mentioned problems, with real-life examples.
OPEN TALK (AI): How To Build An AI Based Knowledge Graph for Customers in Fintech
In this session, we'd go through our journey to build an AI-based Customer Knowledge Graph. We'd share the insights & know-how required to create this scalable & polyglot data platform. Join us to learn the design patterns & best practices that we have developed over time to create an intelligent solution based on AI & Graph technologies for an ever-increasing list of product lines and customers.
OPEN TALK (AI): Patenting Artificial Intelligence – How AI Companies Can Identify and Protect AI Inventions
Artificial intelligence is becoming one of the most widespread and useful technologies in use today. From data collection to model training, language processing to predictive models, deep networks to AI frameworks, there are many categories and implementations of AI, all with protectable features and important business applications. Protecting cutting edge AI technology helps companies achieve business goals and support their AI innovation.
This presentation will identify key strategies to identify which aspects of AI are patentable and which aspects are not. The discussed strategies will be supplemented with practical real-world examples of patenting different areas of the AI process, from data collection to model training and model implementation to output applications, as well as distinct types of AI systems.
Attendees will also learn about AI patent trends and the most common use cases in which different AI companies build valuable patent portfolios around their AI technology.
OPEN TALK (AI): Scalable, Explainable and Unsupervised Anomaly Detection for Telecom
In developing and implementing a telecommunications network, one of the most pressing challenges these companies deal with is anomalies that occur within the network, indicating that something strange (usually an attack, a fraud, or an error) is happening. Detecting these anomalies is a challenge because they may appear in different places and formats and require the observation of multiple metrics over hundreds of thousands of events to tell regular behaviors from anomalous ones. Ivan Carmello De Andrade would like to explain how detecting these anomalies with higher accuracy may be possible with today's technology and machine learning capabilities.
In his technical session, Ivan will explain how he and his team were able to customize and adapt a Robust Random Cut Forest model to identify and explain anomalies in an unsupervised and scalable way. He and his team will explain the process behind creating this solution as well as the challenges they overcame in development, such as extracting behaviors from individual events. He will also explain the benefits of this model to the user, which include the following (a brief code sketch follows the list):
• The user does not need to understand which behaviors are regular or anomalous nor which features are relevant to describe and identify them
• The model provides accountability, because the user can identify and understand which factors lead to an event being identified as an anomaly
• Scalability: in general, the model can be implemented on many different scales, with a highly distributable structure and configurable levels of detail
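For readers unfamiliar with the algorithm, here is a minimal streaming sketch using the open-source rrcf library, with synthetic data and arbitrary forest settings; it is not the speakers' production implementation.

```python
# Minimal streaming anomaly-detection sketch with a Robust Random Cut Forest,
# using the open-source `rrcf` library. Data, forest size, and window settings
# are illustrative assumptions.
import numpy as np
import rrcf

rng = np.random.default_rng(0)
stream = rng.normal(size=(1000, 3))
stream[500] += 8.0                            # inject one obvious anomaly

num_trees, tree_size = 40, 256
forest = [rrcf.RCTree() for _ in range(num_trees)]
scores = {}

for i, point in enumerate(stream):
    score = 0.0
    for tree in forest:
        if len(tree.leaves) > tree_size:      # keep a sliding window of recent events
            tree.forget_point(i - tree_size)
        tree.insert_point(point, index=i)
        score += tree.codisp(i)               # collusive displacement: higher = more anomalous
    scores[i] = score / num_trees

print(max(scores, key=scores.get))            # expected to flag the injected event near index 500
```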
OPEN TALK (AI): Pushing Deepfakes to the Limit - Fake Video Calls with AI
Today's real-time Deepfake technology makes it possible to create indistinguishable doppelgängers of a person and let them participate in video calls. Since 2019, the TNG Innovation Hacking Team has intensively researched and continuously developed the AI around real-time Deepfakes. The final result and the individual steps towards photorealism will be presented in this talk.
Since their first appearance in 2017, Deepfakes have evolved enormously from an AI gimmick into a powerful tool. Meanwhile, different media outlets such as "Leschs Kosmos", Galileo, and other television formats have been using TNG Deepfakes.
In this talk we will show the different evolutionary steps of Deepfake technology, starting with the first Deepfakes and ending with real-time Deepfakes of the entire head in high resolution. Several live demos will shed light on individual components of the software. In particular, we focus on various new technologies to improve Deepfake generation, such as TensorFlow 2 and MediaPipe, and the differences in comparison to our previous implementations.
OPEN TALK (AI): Democratizing Deep Learning with Vector Similarity Search
Deep learning is responsible for most of the breakthroughs we have seen in AI/ML in recent years, yet most companies' models in production use classic or traditional ML. In this talk we will explore how deep learning is being democratized today, thanks to the rising use and availability of vector embeddings from giant pre-trained neural networks. We will see how these embeddings can be combined together with vector similarity search to address different use cases covering any modality and applied to any type of object. Finally, we will discuss the many opportunities this presents as well as the tools that are required to successfully deploy these applications into production.
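As a small illustration of the pairing described above, the sketch below indexes pre-computed embeddings with FAISS and runs a top-k similarity query; the random vectors and exact (flat) index are stand-ins for real embeddings and production-grade approximate indexes.

```python
# Minimal vector-similarity-search sketch with FAISS over pre-computed embeddings.
# The random embeddings and exact (flat) index are illustrative assumptions;
# production systems typically use approximate indexes (e.g. IVF or HNSW).
import faiss
import numpy as np

dim = 384
rng = np.random.default_rng(0)
corpus = rng.random((10_000, dim)).astype("float32")   # stand-in for embeddings from a pre-trained model
queries = rng.random((5, dim)).astype("float32")

faiss.normalize_L2(corpus)                   # normalize so inner product = cosine similarity
faiss.normalize_L2(queries)

index = faiss.IndexFlatIP(dim)
index.add(corpus)

scores, neighbor_ids = index.search(queries, 5)   # top-5 most similar items per query
print(neighbor_ids[0], scores[0])
```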
Thursday, October 27, 2022
KEYNOTE (AI): Iterate.ai -- AI Will Fuel 2023’s Innovation Explosion – What Can You Do Now?
2023 is the inflection point when a matured $98 billion AI market defines a truly new age of innovation for enterprises across industries. The convergence of several maturing technologies all now steering toward 2023 ubiquity – including 5G, IoT, blockchain, and low-code software platforms – will enable AI technologies to fast-track innovation to a degree that enterprises haven’t yet seen and enable wholly new customer experiences. Enterprises proficient with AI going into 2023 will wield a decisive competitive advantage; what do they need to be doing now?
Enterprises have just a one-year head start to prepare for the explosion in innovation that demonstrably more matured AI, combined with several other advances, will unlock. This talk offers attendees a crucial opportunity to understand the coming AI-led transformation, why 2023 is pivotal, and how to take steps now that position their businesses at the leading edge of these uniquely profound market changes.
Attendees of this presentation will come away with a clear picture of how AI will transform enterprise innovation, the advantages available to those that prepare appropriately, and how to accelerate AI strategies within their organizations. IDC predicts that once AI hits scale, AI-powered businesses will respond to customers and competitors 50% faster than competitors. Powered by tiny powerful AI chips – 50 can now fit on the head of a penny – products and sensors with localized edge-processing capabilities will do their own thinking. Countless AI interactions will contribute data in real-time, enabling new product experiences, rapid iteration of software solutions using low-code drag-and-drop development, IoT-powered backend and supply chain efficiency, and blockchain-secured digital identities and privacy. Ultimately, enterprises that take steps to become AI-ready today will command greater customer satisfaction and success tomorrow.
FEATURED TALK (AI): Circumventing Scripting: Automating Conversation Design
Whether building a chatbot with or without code, the scripting process remains a behemoth task. We're looking at all the ways Conversational Design can be automated, to make building a chatbot script less burdensome and open up the field to creative users who can help exponentially expand chatbot use cases. At BOTS, we strive to get creative users building chatbots and A.I. solutions regardless of background. This year, we launched a STEM version in schools, where students in K-5 built their own chatbots to support their lessons and learn about A.I.
OPEN TALK (AI): Shift Left Strategy to Enable Autonomous Data Science
Data Science is hard; achieving ROI from your AI projects is even harder. Data Scientists spend more time wrangling data and slinging models to software and DevOps engineers than developing and analyzing their ML models. The solution is to enable a culture shift similar to the DevOps movement, where developers manage software quality in production: data scientists should manage ML model performance in production environments. Dedicated ML Engineers are helping to bridge this transition, but they struggle with the tools and automation required to enable scale with autonomy.
Join Manish Modh, Founder & CEO of Andromeda 360 AI, on this journey to envision a world of autonomous data science in which Data Scientists and ML Engineers are empowered to own the development, deployment, operations, and performance of their machine learning use cases. Experience the challenges data science teams face today and why most AI projects fail. Learn the art of the possible, leveraging the wisdom gathered over 20 years of technology evolution across Big Data, Cloud, DevSecOps, AI/ML, and Edge computing.
PRO TALK (AI): Physics-Based Graph Neural Networks Enable Composable, Strongly Typed Neural Networks
PassiveLogic’s (www.passivelogic.com) platform for generalized autonomy utilizing Deep Digital Twins is built on systems-level control theory. The platform is generalized because it can be used to control any kind of system. At its core, this type of platform works on the sensor-fusion and control-fusion of digital models. In these Deep Digital Twin models, the digital twin literally is the AI structure. Each digital twin utilizes the fundamentals of physics to model a single component or piece of equipment. When multiple digital twins are linked to each other in a graph neural network, they form a system description. Because their physics are integral to the models themselves, these graph-based system descriptions model not only the real complexities of systems but also their emergent behavior and the system semantics.
Deep physics networks are structured similarly to neural networks, but unlike the homogeneous activation functions of neural nets, each neuron comprises unique physical equations representing a function in a thermodynamic system. The Deep Physics approach is built on heterogeneous neural nets that are composable, have physics guarantees, allow users to define their own systems, learn unsupervised, and generate a physics description of a system. Being so principled, it is also necessarily more constrained, meaning the physics-based graph neural networks can be used to predict future system behavior.
The physics-based graph neural network provides a systems-level intelligence as it understands the interconnectivity of components in a system. As such, it can automatically infer behavior and introspect results, even where sensors do not exist. Using this inference ability, an autonomous control platform built on Deep Digital Twins can provide self-commissioning, automate point-mapping, validate installation, and provide continuous system measurement and verification against its original design. Real-time system operational data can be brought into the model for real-time machine learning so that the model can adapt for improved accuracy of predicting the system behavior.
In this talk, Troy Harvey, CEO at PassiveLogic, will describe Deep Digital Twin AI structures and the applications for generalized autonomy.
OPEN TALK (AI): Operationalizing AI with a Shift from Research to Product Orientation
Many AI programs fail to deliver sustained value despite great research, due to insufficient operational tools, processes, and practices. These days, more and more data science teams are going through a major shift from research orientation to product orientation. Key factors in successfully transitioning to a product-oriented approach to AI include empowering data scientists to take end-to-end accountability for model performance, and going beyond the model to gain a granular understanding of the behavior of the entire AI-driven process. In this talk, Yotam will discuss the importance of empowering data science teams to successfully make the transition from research-oriented to product-oriented.
OPEN TALK (AI): Conversational AI Solutions for the Metaverse of Work
Is your enterprise ready to engage its customers and employees in new immersive experiences powered by web3 and the Metaverse? With Facebook's Horizons and Microsoft's Teams making significant product investments into creating underlying Metaverse platforms for enterprises to launch both employee- and customer-facing experiences, organizations will need tailored conversational strategies and specialized tools to drive effective engagement on these evolving Metaverse platforms. This session will explore the critical role of Conversational AI technologies in creating effective Metaverse solutions and experiences, and will also address the key considerations for conversational AI in applications of Metaverse technologies for improving work productivity, deploying interactive learning environments, and powering e-commerce.
OPEN TALK (AI): Level Up Your Data Lake - to ML and Beyond
A data lake is primarily two things: an object store and the objects being stored. Even with the most basic setup, data lakes are capable of supporting BI, Machine Learning, and operational analytics use cases. This flexibility speaks to the strength of object stores, particularly their flexibility in integrating with a diverse set of data processing engines.
As data lakes exploded in adoption, a number of improvements were made to the first architectures. The first and most obvious improvement was to file formats, which led to the development of analytics-optimized formats like parquet, and eventually modern table formats.
An even newer improvement has been the emergence of data source control tools that bring new levels of manageability across an entire lake! In this talk, we'll cover how to incorporate these technologies into your data lake, and how they simplify workflows critical to ML experimentation, deployment of datasets, and more!
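To make the file-format point concrete, here is a minimal sketch (with hypothetical paths and schema) of writing a partitioned Parquet dataset and reading back only the columns and partitions a query needs; in a data lake the same calls would typically target an object store rather than local disk.

```python
# Minimal sketch of the "analytics-optimized file format" idea: write a columnar,
# partitioned Parquet dataset and read back only what a query needs.
# Paths and schema are illustrative assumptions.
import pandas as pd
import pyarrow as pa
import pyarrow.dataset as ds
import pyarrow.parquet as pq

events = pd.DataFrame({
    "country": ["US", "US", "DE"],
    "user_id": [1, 2, 1],
    "amount": [9.99, 15.00, 4.50],
})

# Columnar, partitioned layout is what makes scans cheap for BI and ML feature jobs.
pq.write_to_dataset(pa.Table.from_pandas(events), root_path="lake/events",
                    partition_cols=["country"])

# Read back just one partition and one column without touching the rest of the data.
subset = ds.dataset("lake/events", partitioning="hive").to_table(
    columns=["amount"], filter=ds.field("country") == "DE")
print(subset.to_pandas())
```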
OPEN TALK (AI): Reducing Latency and Resource Consumption for Offline Feature Generation
Personalization is one of the key pillars of Netflix as it enables each member to experience the vast collection of content tailored to their interests. Our personalization system is powered by various machine learning models. We constantly innovate by adding new features to our personalization models and running A/B tests to improve recommendations for our members. We also continue to see that providing larger training sets to our models helps make better predictions. Our ML fact store has enabled us to provide larger training sets where the training set spans over a long time window. While a great success, the ML fact store architecture has its limitations. For example, features computed while generating recommendations must be recomputed by offline feature generation pipelines. This talk is about those limitations and how we enhanced our architecture to run optimized offline feature generation pipelines.
OPEN TALK (AI): Bringing Life and Motion to AI Explainability
SHAP is a great tool to help developers and users understand black-box models. To push it to the next level, we will show how to leverage Dash, SHAP, GIFs, and auto-encoders to generate interactive dashboards with animations and visual representations that show how different AI models learn and change their minds as they are progressively trained with growing amounts of data.
Animations will help developers understand how frequently AI models tweak their population and local importance factors during training and how they compare across competing AI models, adding an extra layer to AI safety. Auto-encoders and LSTM will be used to generate 2-dimensional embedding representations of explainability paths at individual level, allowing developers to interactively detect algorithm decision making similarity across time and visually debug mislabeled AI predictions at each point in time.
We will show this application in the context of Chronic Kidney Disease prediction and broader Healthcare AI.
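For orientation, here is a minimal sketch of the SHAP side of such a dashboard: train a simple model, compute per-sample SHAP values, and plot a global explanation. The dataset and model are illustrative assumptions, and the Dash and animation layers described above are omitted.

```python
# Minimal SHAP sketch: explain a gradient-boosting classifier's predictions.
# Dataset and model are illustrative assumptions; the interactive Dash/animation
# layers described in the session are not shown here.
import shap
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = GradientBoostingClassifier().fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)   # per-sample, per-feature contributions

print(shap_values.shape)                 # (n_samples, n_features)
shap.summary_plot(shap_values, X)        # global importance view; a Dash app would
                                         # render per-sample views like this interactively
```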
Tuesday, November 1, 2022
[#VIRTUAL] PRO Workshop (AI): How Route Optimisation Can Be Scaled and Optimised Using Meta Heuristics for Realistic Scenario
E-commerce platforms drive the current era, and the COVID pandemic gave rise to the need for home delivery. End consumers have multiple options to cater to their needs, so e-commerce platforms have to provide on-time, quality delivery to stay ahead in the market and, at the same time, boost their profit margins.
Route Optimization is one of the most critical aspects of planning and transportation. It ensures that deliveries always arrive on time and are carried out with the lowest possible cost and energy consumption. However, there are a lot of variables that e-commerce platforms need to consider in a real-time scenario.
During this unfortunate COVID pandemic, e-commerce platforms deal with a massive inflow of orders from customers scattered throughout a city, a country, or even across the globe. This brings an enormous number of variables into play that cannot be handled using conventional methods in a reasonable amount of time. With the recent developments in AI, machine learning, and cloud data, the entire game of route optimization has begun to change. AI continuously retrieves data, learns from it, and searches for improved methods to ensure the most optimal routes for the drivers.
In our novel solution, we are trying to solve the multi-objective vehicle routing problem with optimization variables such as minimizing delivery cost, the number of vehicles, and delivery time. To show this as a real-life simulation, we will walk through the open-source veroviz library, combined with innovative scaling solutions, to showcase the real-time implementation of route optimization in any part of the world.
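As a toy illustration of the heuristic flavor of such solvers (not the speakers' multi-objective approach or the veroviz workflow), here is a minimal single-vehicle sketch: a nearest-neighbour construction followed by a 2-opt improvement pass, with made-up coordinates.

```python
# Minimal route-optimization heuristic sketch: nearest-neighbour construction plus
# a 2-opt improvement pass for a single vehicle. Coordinates are illustrative
# assumptions; the session's multi-objective VRP and metaheuristics go well beyond this.
import math
import random

random.seed(0)
stops = [(random.random(), random.random()) for _ in range(30)]   # (x, y) delivery points
depot = (0.5, 0.5)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    pts = [depot] + [stops[i] for i in route] + [depot]
    return sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

# Greedy construction: always visit the closest unvisited stop next.
unvisited, current, route = set(range(len(stops))), depot, []
while unvisited:
    nxt = min(unvisited, key=lambda i: dist(current, stops[i]))
    route.append(nxt)
    unvisited.remove(nxt)
    current = stops[nxt]

# 2-opt local search: reverse segments while doing so shortens the route.
improved = True
while improved:
    improved = False
    for i in range(len(route) - 1):
        for j in range(i + 2, len(route)):
            candidate = route[:i] + route[i:j][::-1] + route[j:]
            if route_length(candidate) < route_length(route):
                route, improved = candidate, True

print(f"route length: {route_length(route):.3f}")
```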
[#VIRTUAL] PRO Workshop (AI): Sparsity without Sacrifice – How to Accelerate AI Models Without Losing Accuracy
Most companies with AI models in production today are grappling with stringent latency requirements and escalating energy costs. One way to reduce these burdens is by pruning such models to create sparse, lightweight networks. Pruning involves the iterative removal of weights from a pre-trained dense network to obtain a network with fewer parameters, trading off against model accuracy. Determining which weights should be removed in order to minimize the impact on the network's accuracy is critical. For real-world networks with millions of parameters, however, analytical determination is often computationally infeasible; heuristic techniques are a compelling alternative.
In this presentation, we talk about how to implement commonly used heuristics such as gradual magnitude pruning (GMP) in production, along with their associated accuracy-speed trade-offs, using the BERT family of language models as an example.
Next, we cover ways of accelerating such lightweight networks to achieve peak computational efficiency and reduce energy consumption. We walk through how our acceleration algorithms optimize hardware efficiency, unlocking order-of-magnitude speedups and energy savings.
Finally, we present best practices on how these techniques can be combined to achieve multiplicative effects in reducing energy consumption costs and runtime latencies without sacrificing model accuracy.
[#VIRTUAL] PRO Workshop (AI): Scaling ML Embedding Models to Serve a Billion Queries
This talk aims to provide deeper insight into the scale, challenges, and solutions formulated for powering embedding-based visual search at eBay. It walks the audience through the model architecture, the application architecture for serving users, the workflow pipelines built to produce the embeddings used by Cassini, eBay's search engine, and the unique challenges faced along this journey. The talk provides key insights specific to embedding handling and how to scale systems to provide real-time clustering-based solutions for users.
[#VIRTUAL] PRO Workshop (AI): Artificial General Intelligence with GPT-3 with Open AI
Large Language Models (LLMs) have come out of the realm of academia and research and become available to average development teams, thanks to the efforts of OpenAI and its competitors. Now that we have access to them, what can we do with them?
This talk will explore some of the practical uses for GPT-3 made available through OpenAI. We will start with a brief introduction to LLMs and transformers and how they bring us a step closer to artificial general intelligence. We will focus on real demonstrations. Each capability will start with a canned demonstration and move on to ad hoc input provided by the audience.
• Text Generation
○ Turn complex text into a simple summary
○ Create an outline of an essay
• Conversation
○ Sarcastic chat bot
• Code Generation
○ Explain Python Code
○ Translate text into programmatic commands
• Question Answering
○ Factual Answering
You will leave this talk with an understanding of Large Language Models and their practical use cases. Walk away inspired on how to apply large language models to your business today!
Wednesday, November 2, 2022
[#VIRTUAL] PRO TALK (AI): ML Drift Monitoring: What to Observe, How to Analyze & When to Act
Deploying a new ML model in production successfully is a great achievement, but it is also the beginning of a persistent challenge to keep it performing at expected levels. Models in production will drift and decay, and the value they provide to the business will drop. ML drift monitoring is a challenging task: identifying the right data to collect, the right metrics to compute, the right trends to analyze, and the right actions to take. This session will explore the process of model drift monitoring, from model instrumentation to determining the next best action. Real-life challenges will be explored, and best practices and recommendations will be discussed.
[#VIRTUAL] OPEN TALK (AI): Lessons Learned Building Natural Language Systems in Healthcare
This session reviews case studies from real-world projects that built AI systems that use Natural Language Processing (NLP) in healthcare. These case studies cover projects that deployed automated patient risk prediction, automated diagnosis, clinical guidelines, and revenue cycle optimization.
We will cover why and how NLP was used, what deep learning models and libraries were used, how transfer learning enables tuning accurate models from small datasets, and what was productized and achieved. Key takeaways for attendees will include applicable best practices for NLP projects, including how to build domain-specific healthcare models and how to use NLP as part of larger machine learning and deep learning pipelines.
[#VIRTUAL] KEYNOTE (AI): LivePerson -- Building a Mental Model Around Conversational AI: Why We Need to Teach How to Interact with Bots
Use of conversational AI across retail, finance, healthcare, and other industries is on the rise. Whether they recognize it or not, today's consumers are rapidly shifting their mindset — they are ready for, and even demand, a new type of interaction with brands centered around messaging: Indeed, new research shows that over three-quarters (78%) of consumers want the ability to message with businesses and 83% would browse or buy products in messaging conversations.
Perhaps most importantly, consumers are suddenly, radically more open to automated conversations now than ever before: Positive sentiment towards chatbots nearly doubled in 2021 (61%) vs in 2020 (31%).
Despite new capabilities that make chatting with a conversational AI bot more like having a conversation with a human, there isn't yet a prevailing mental model for what conversational AI is that will help people get the most out of their interactions with it. Simply put, people aren't sure how to talk to bots. On the one hand, some people treat it like a search engine, typing in short commands; on the other, some treat it like another human, telling long-winded stories and burying the question or issue they are really trying to address.
Similar to when search engines were first invented and people had to figure out how to effectively use them, many people may not know how to maximize the efficiency of a bot conversation. Tech companies can and must take the lead on that instruction to enable correct use of their products and to help users get the most benefit out of them.
During this session, Joe Bradley will offer guidance on how companies can help users find the middle ground between these two scenarios, and how they can begin creating a playbook for cultivating best practices for interacting with conversational AI.
There are many questions around how companies should teach people to interact with conversational AI and how they can make this form of communication most successful that are just now being explored – How can we be sensitive to the fact that different people will respond to conversational AI in different ways? How can we help people learn and get the most out of this new type of interaction? Not only do these questions intersect with machine learning but they also involve psychology and sociology.
While few people have the time (or interest) to dive deep into how best to interact with conversational AI, bot builders can begin to offer clues and guidance on how to engage with conversational AI bots effectively. Having previously worked on data science and e-commerce projects at Amazon and Nike, and having advised brands like David's Bridal and Virgin Atlantic at LivePerson on how to build their bot strategies, Joe Bradley will share his learnings on how to build a mental model around conversational AI that gets the most out of this increasingly used form of interaction.
[#VIRTUAL] PRO TALK (AI): Data Ecosystem a Stepping Stone for Decarbonization of Operation Industry
Climate change is possibly one of the most complex and challenging issues on earth, and manufacturing companies often find themselves in the crosswinds of it. The oil and gas, mining, chemical, cement, energy, and utility sectors are responsible for more than 50% of industrial GHG emissions. The changes they are bringing into their operations are not enough to address the issue, and new initiatives for carbon abatement are not showing any visible improvement in reducing GHG levels in the environment.
In this session, we will analyze how data ecosystems such as LiDAR and remote-sensing data, together with the IT and OT data pertinent to these manufacturing companies, can help them track/measure, trace, and mitigate excess emissions from their operations. We will also explore how advanced AI techniques such as deep learning and reinforcement learning can be used effectively to find optimal solutions for the above-mentioned problems, with real-life examples.
[#VIRTUAL] PRO TALK (AI): Designing and Applying AI on Large Volumes of IoT Telemetry Data in Azure
IoT devices produce large amounts of telemetry data. We need to ingest, store, visualize, and analyze these data, and build ML models on them to make them useful. In this session, we will talk about ways to deal with IoT data: using IoT data to train AI models, building AI models, and deploying AI models to run inference in real time on large volumes of IoT data using Azure AI tools.
[#VIRTUAL] PRO Workshop (AI): Intro to Machine Learning with ML.NET
Come and get immersed in the world of machine learning with this introduction to and demonstration of ML.NET. We'll show how to create an app that can predict the type of iris flower based on features such as petal length. We'll show how to download and install ML.NET, create a data set, write the required C# code, and run the finished app.
[#VIRTUAL] PRO Workshop (AI): Deploying Machine Learning Models with Pulsar Functions
In this talk I will present a technique for deploying machine learning models to provide real-time predictions using Apache Pulsar Functions. In order to provide a prediction in real time, the model usually receives a single data point from the caller and is expected to provide an accurate prediction within a few milliseconds.
Throughout this talk, I will demonstrate the steps required to deploy a fully trained ML model that predicts the delivery time for a food delivery service based on real-time traffic information, the customer's location, and the restaurant that will be fulfilling the order.
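A minimal sketch of how such a deployment can look with the Pulsar Functions Python SDK follows; the model artifact, input schema, and feature handling are assumptions for illustration, not the exact system presented in the talk.

```python
# Minimal sketch of serving a pre-trained model inside an Apache Pulsar Function
# (Python SDK). The model file, input schema, and features are illustrative assumptions.
import json
import joblib
from pulsar import Function

class DeliveryTimePredictor(Function):
    """Consumes one JSON order event per message and returns a predicted delivery time."""

    def __init__(self):
        self.model = None   # loaded lazily on the first message

    def process(self, input, context):
        if self.model is None:
            self.model = joblib.load("delivery_time_model.joblib")  # hypothetical artifact
        event = json.loads(input)
        features = [[event["distance_km"], event["traffic_index"], event["restaurant_prep_min"]]]
        minutes = float(self.model.predict(features)[0])
        context.get_logger().info(f"predicted {minutes:.1f} min for order {event.get('order_id')}")
        return json.dumps({"order_id": event.get("order_id"), "eta_minutes": minutes})
```

Such a class would typically be packaged and registered with the pulsar-admin functions create command, pointed at an input topic carrying order events and an output topic for the predictions.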
[#VIRTUAL] OPEN TALK (AI): How To Build An AI Based Knowledge Graph for Customers in Fintech
In this session, we'd go through our journey to build an AI-based Customer Knowledge Graph. We'd share the insights & know-how required to create this scalable & polyglot data platform. Join us to learn the design patterns & best practices that we have developed over time to create an intelligent solution based on AI & Graph technologies for an ever-increasing list of product lines and customers.
[#VIRTUAL] OPEN TALK (AI): It’s an AI Product Manager’s Job to Help an Organization Succeed with Predictive Machine Learning
In short, AI is a lifecycle that requires the integration of data, machine learning models, and the software around them. It covers everything from scoping and designing to building and testing, all the way through to deployment — and eventually requires frequent monitoring. Product managers need to ensure that data scientists are delivering results in efficient ways so business counterparts can understand, interpret, and learn from them. This includes everything from the definition of the problem, the coverage and quality of the data set and its analysis, to the presentation of results and the follow-up.
[#VIRTUAL] OPEN TALK (AI): Patenting Artificial Intelligence – How AI Companies Can Identify and Protect AI Inventions
Artificial intelligence is becoming one of the most widespread and useful technologies in use today. From data collection to model training, language processing to predictive models, deep networks to AI frameworks, there are many categories and implementations of AI, all with protectable features and important business applications. Protecting cutting edge AI technology helps companies achieve business goals and support their AI innovation.
This presentation will identify key strategies to identify which aspects of AI are patentable and which aspects are not. The discussed strategies will be supplemented with practical real-world examples of patenting different areas of the AI process, from data collection to model training and model implementation to output applications, as well as distinct types of AI systems.
Attendees will also learn about AI patent trends and the most common use cases in which different AI companies build valuable patent portfolios around their AI technology.
[#VIRTUAL] KEYNOTE (AI): Snowflake -- Training, Deploying, and Running a ML model using Python and Snowpark
In this session, we will train a Linear Regression model to predict future ROI (Return On Investment) of variable advertising spend budgets across multiple channels including search, video, social media, and email using Snowpark for Python and scikit-learn. By the end of the session, you will have an interactive web application deployed visualizing the ROI of different allocated advertising spend budgets.
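For orientation, a minimal sketch along the lines the abstract describes, assuming a hypothetical MARKETING_SPEND table and placeholder connection parameters; it is not the official lab code, and the web-application layer is omitted.

```python
# Minimal sketch: pull channel-spend data with Snowpark for Python and fit a
# scikit-learn linear regression to predict ROI. Connection parameters, table,
# and column names are illustrative assumptions.
from snowflake.snowpark import Session
from sklearn.linear_model import LinearRegression

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Hypothetical table with historical spend per channel and realized revenue.
spend = session.table("MARKETING_SPEND").to_pandas()

X = spend[["SEARCH", "VIDEO", "SOCIAL_MEDIA", "EMAIL"]]
y = spend["REVENUE"]
model = LinearRegression().fit(X, y)

# Predict revenue for a candidate budget allocation (placeholder values).
candidate = [[250_000, 150_000, 100_000, 50_000]]
print(model.predict(candidate)[0])
```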
[#VIRTUAL] OPEN TALK (AI): Scalable, Explainable and Unsupervised Anomaly Detection for Telecom
In developing and implementing a telecommunications network, one of the most pressing challenges these companies deal with is anomalies that occur within the network, indicating that something strange (usually an attack, a fraud, or an error) is happening. Detecting these anomalies is a challenge because they may appear in different places and formats and require the observation of multiple metrics over hundreds of thousands of events to tell regular behaviors from anomalous ones. Ivan Carmello De Andrade would like to explain how detecting these anomalies with higher accuracy may be possible with today's technology and machine learning capabilities.
In his technical session, Ivan will explain how he and his team were able to customize and adapt a Robust Random Cut Forest model to identify and explain anomalies in an unsupervised and scalable way. He and his team will explain the process behind creating this solution as well as the challenges they overcame in development, such as extracting behaviors from individual events. He will also explain the benefits of this model to the user, which include the following:
• The user does not need to understand which behaviors are regular or anomalous nor which features are relevant to describe and identify them
• The model provides accountability, because the user can identify and understand which factors lead to an event being identified as an anomaly
• Scalability: in general, the model can be implemented on many different scales, with a highly distributable structure and configurable levels of detail
[#VIRTUAL] OPEN TALK (AI): Pushing Deepfakes to the Limit - Fake Video Calls with AI
Today's real-time Deepfake technology makes it possible to create indistinguishable doppelgängers of a person and let them participate in video calls. Since 2019, the TNG Innovation Hacking Team has intensively researched and continuously developed the AI around real-time Deepfakes. The final result and the individual steps towards photorealism will be presented in this talk.
Since their first appearance in 2017, Deepfakes have evolved enormously from an AI gimmick into a powerful tool. Meanwhile, different media outlets such as "Leschs Kosmos", Galileo, and other television formats have been using TNG Deepfakes.
In this talk we will show the different evolutionary steps of Deepfake technology, starting with the first Deepfakes and ending with real-time Deepfakes of the entire head in high resolution. Several live demos will shed light on individual components of the software. In particular, we focus on various new technologies to improve Deepfake generation, such as TensorFlow 2 and MediaPipe, and the differences in comparison to our previous implementations.
[#VIRTUAL] OPEN TALK (AI): The Enterprise Ready Feature Store: Scaling your Feature Store for Real-time AI/ML
No longer considered a new concept, ML Feature Stores have existed for several years now, becoming the cornerstone of MLOps platforms. Today, with the rise of real-time AI and the wide span of AI/ML use cases they enable, it's no wonder that some companies are already outgrowing their existing Feature Stores. This talk is both for those who are new to Feature Stores and those looking to scale or upgrade their existing implementation. It will explore how to make sure your Feature Store is both future-proof and enterprise-ready across supported ML feature types and advanced functionalities, as well as the infrastructure and operational considerations required to cost-effectively deliver real-time AI/ML use cases with low latency at scale. The talk will cover a range of approaches, including building your own feature store, using open-source products such as Feast or Feathr, or opting for a commercial Feature Store implementation. Each option will also be considered in the context of the rise of real-time AI and the specific challenges it creates.
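As a small illustration of the low-latency serving path such a store provides, here is a minimal sketch using the open-source Feast option mentioned in the talk; the feature view, feature names, and entity key are assumptions for illustration.

```python
# Minimal sketch of online feature retrieval with the open-source Feast feature store.
# Feature view, feature names, and entity key are illustrative assumptions; this
# assumes a feature repo where `feast apply` and materialization have already run.
from feast import FeatureStore

store = FeatureStore(repo_path=".")

features = store.get_online_features(
    features=[
        "driver_hourly_stats:conv_rate",   # hypothetical feature_view:feature names
        "driver_hourly_stats:acc_rate",
    ],
    entity_rows=[{"driver_id": 1001}],      # hypothetical entity key
).to_dict()

print(features)   # feature vector served from the online store for real-time inference
```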
Thursday, November 3, 2022
[#VIRTUAL] KEYNOTE (AI): Indico Data - Unstructured Data: Challenge and Opportunity for the AI Developer
Unstructured Data represents a massive and little-explored frontier for both the enterprise and the enterprise technology professional. The dizzying proliferation of tools for programmatically working with documents, audio, images, and video (as well as the corresponding hype) can be overwhelming. This session will provide a practical framework for breaking down the analysis and automation of unstructured data stores and flows, as well as a survey of success stories.
[#VIRTUAL] OPEN TALK (AI): Deep Dive on Creating a Photorealistic Talking Avatar
Creating a photorealistic avatar that speaks any sentence, starting from written input text.
Focusing on autoencoders, we will take a journey from the beginning (of the speaker's experience), covering the mistakes made and tips learned along the path.
The following will be showcased:
- Intro, the timeline from beginning to nowadays
- Why this is NOT a deepfake
- Audio processing techniques: STFT (Short-Time Fourier Transform), mel spectrograms, and custom solutions
- Deep learning models and architectures
- The technique, inspired by inpainting, used to animate the mouth
- Masks and convolution
- Landmarks extraction
- A morphing animation technique based on autoencoder features
- Microsoft Azure Speech services used to support audio and animation processing
- Putting it all together
[#VIRTUAL] KEYNOTE (AI): Iterate.ai - AI Will Fuel 2023’s Innovation Explosion – What Can You Do Now?
2023 is the inflection point when a matured $98 billion AI market defines a truly new age of innovation for enterprises across industries. The convergence of several maturing technologies all now steering toward 2023 ubiquity – including 5G, IoT, blockchain, and low-code software platforms – will enable AI technologies to fast-track innovation to a degree that enterprises haven't yet seen and enable wholly new customer experiences. Enterprises proficient with AI going into 2023 will wield a decisive competitive advantage; what do they need to be doing now?
Enterprises have just a one-year head start to prepare for the explosion in innovation that demonstrably more matured AI, combined with several other advances, will unlock. This talk offers attendees a crucial opportunity to understand the coming AI-led transformation, why 2023 is pivotal, and how to take steps now that position their businesses at the leading edge of these uniquely profound market changes.
Attendees of this presentation will come away with a clear picture of how AI will transform enterprise innovation, the advantages available to those that prepare appropriately, and how to accelerate AI strategies within their organizations. IDC predicts that once AI hits scale, AI-powered businesses will respond to customers and competitors 50% faster than competitors. Powered by tiny powerful AI chips – 50 can now fit on the head of a penny – products and sensors with localized edge-processing capabilities will do their own thinking. Countless AI interactions will contribute data in real-time, enabling new product experiences, rapid iteration of software solutions using low-code drag-and-drop development, IoT-powered backend and supply chain efficiency, and blockchain-secured digital identities and privacy. Ultimately, enterprises that take steps to become AI-ready today will command greater customer satisfaction and success tomorrow.
[#VIRTUAL] PRO TALK (AI): Avoid Mistakes Building AI Products
Based on Gartner's research, 85% of AI projects fail. In this talk, we show the most typical mistakes made by managers, developers, and data scientists that can make a product fail. We draw on ten case studies of products that failed and explain the reasons for each failure. On the other hand, we show how to avoid such mistakes by introducing a few lifecycle changes that make an AI product more likely to succeed.
[#VIRTUAL] FEATURED TALK (AI): Circumventing Scripting: Automating Conversation Design
Whether building a chatbot with or without code, the scripting process remains a behemoth task. We're looking at all the ways Conversational Design can be automated, to make building a chatbot script less burdensome and open up the field to creative users who can help exponentially expand chatbot use cases. At BOTS, we strive to get creative users building chatbots and A.I. solutions regardless of background. This year, we launched a STEM version in schools, where students in K-5 built their own chatbots to support their lessons and learn about A.I.
[#VIRTUAL] OPEN TALK (AI): Shift Left Strategy to Enable Autonomous Data Science
Data Science is hard; achieving ROI from your AI projects is even harder. Data Scientists spend more time wrangling data and slinging models to software and DevOps engineers than developing and analyzing their ML models. The solution is to enable a culture shift similar to the DevOps movement, where developers manage software quality in production: data scientists should manage ML model performance in production environments. Dedicated ML Engineers are helping to bridge this transition, but they struggle with the tools and automation required to enable scale with autonomy.
Join Manish Modh, Founder & CEO of Andromeda 360 AI, on this journey to envision a world of autonomous data science in which Data Scientists and ML Engineers are empowered to own the development, deployment, operations, and performance of their machine learning use cases. Experience the challenges data science teams face today and why most AI projects fail. Learn the art of the possible, leveraging the wisdom gathered over 20 years of technology evolution across Big Data, Cloud, DevSecOps, AI/ML, and Edge computing.
[#VIRTUAL] PRO TALK (AI): Leveraging Automated Machine Learning to Enable Anyone to Develop Machine Learning Solutions
Nowadays, many business owners know that leveraging Artificial Intelligence capabilities in their systems and applications can enable their businesses to achieve better results. But building Artificial Intelligence solutions can be a time-consuming and complex process. Consequently, some give up on building such solutions because they or their team do not have the required expertise and capacity, or they end up paying third-party companies to build them and making a significant investment in the process. Azure Automated Machine Learning is a solution that enables anyone to build Artificial Intelligence and Machine Learning solutions at low cost and with the best quality possible.
[#VIRTUAL] KEYNOTE (AI): Samsung Next -- Synthetic Data Generation Using AI For Metaverse
AI has been evolving to create Synthetic Media, and now we are looking at its impact on the future of the Metaverse, which is a $1T market. We will look at some novel research going on at Stanford University, UC Berkeley, and MIT in this space. We will also evaluate the business impact and opportunity in this market.
[#VIRTUAL] OPEN TALK (AI): Data-Driven Models Using Graphs for Communication Networks
Behavior identification is a typical requirement for communication network issues, such as malicious call identification, DoS attacks, and fault recognition. Classical data-driven models using regular-structure data have been widely explored without success, due to the limited expressivity of these types of data.
In his technical session, Caio Vinicius Dadauto will provide details on why graphs are suitable for communication networks and how to use them to improve the quality of machine learning models. He will give an overview of graph neural networks, graph kernels, and complex network metrics, emphasizing the relevance of these graph properties to data-driven solutions in communication networks.
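To make the complex-network-metrics idea concrete, here is a minimal sketch (with a made-up call graph) of deriving per-node graph features with networkx that a downstream model or GNN could consume; the graph and feature choice are illustrative assumptions.

```python
# Minimal sketch of turning a communication network into graph features with networkx.
# The toy call graph and chosen metrics are illustrative assumptions.
import networkx as nx

# Hypothetical call graph: nodes are phone numbers, edges are calls between them.
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),                              # tight community of regular callers
    ("X", "A"), ("X", "B"), ("X", "C"), ("X", "D"), ("X", "E"),      # high-fan-out node
])

degree = nx.degree_centrality(G)
clustering = nx.clustering(G)
betweenness = nx.betweenness_centrality(G)

# Per-node feature vectors that a downstream classifier (or GNN) could consume.
features = {n: (degree[n], clustering[n], betweenness[n]) for n in G.nodes}
for node, feats in features.items():
    print(node, feats)
```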
[#VIRTUAL] PRO TALK (AI): Physics-Based Graph Neural Networks Enable Composable, Strongly Typed Neural Networks
PassiveLogic's (www.passivelogic.com) platform for generalized autonomy utilizing Deep Digital Twins is built on systems-level control theory. The platform is generalized because it can be used to control any kind of system. At its core, this type of platform works on the sensor-fusion and control-fusion of digital models. In these Deep Digital Twin models, the digital twin literally is the AI structure. Each digital twin utilizes the fundamentals of physics to model a single component or piece of equipment. When multiple digital twins are linked to each other in a graph neural network, they form a system description. Because their physics are integral to the models themselves, these graph-based system descriptions model not only the real complexities of systems but also their emergent behavior and the system semantics.
Deep physics networks are structured similarly to neural networks, but unlike the homogeneous activation functions of neural nets, each neuron comprises unique physical equations representing a function in a thermodynamic system. The Deep Physics approach is built on heterogeneous neural nets that are composable, have physics guarantees, allow users to define their own systems, learn unsupervised, and generate a physics description of a system. Being so principled, it is also necessarily more constrained, meaning the physics-based graph neural networks can be used to predict future system behavior.
The physics-based graph neural network provides a systems-level intelligence as it understands the interconnectivity of components in a system. As such, it can automatically infer behavior and introspect results, even where sensors do not exist. Using this inference ability, an autonomous control platform built on Deep Digital Twins can provide self-commissioning, automate point-mapping, validate installation, and provide continuous system measurement and verification against its original design. Real-time system operational data can be brought into the model for real-time machine learning so that the model can adapt for improved accuracy of predicting the system behavior.
In this talk, Troy Harvey, CEO at PassiveLogic, will describe Deep Digital Twin AI structures and the applications for generalized autonomy.
[#VIRTUAL] OPEN TALK (AI): Operationalizing AI with a Shift from Research to Product Orientation
Many AI programs fail to deliver sustained value despite great research, due to insufficient operational tools, processes, and practices. These days, more and more data science teams are going through a major shift from research orientation to product orientation. Key factors in successfully transitioning to a product-oriented approach to AI include empowering data scientists to take end-to-end accountability for model performance, and going beyond the model to gain a granular understanding of the behavior of the entire AI-driven process. In this talk, Yotam will discuss the importance of empowering data science teams to successfully make the transition from research-oriented to product-oriented.
[#VIRTUAL] OPEN TALK (AI): Conversational AI Solutions for the Metaverse of Work
Is your enterprise ready to engage its customers and employees in new immersive experiences powered by web3 and the Metaverse? With Facebook's Horizons and Microsoft's Teams making significant product investments into creating underlying Metaverse platforms for enterprises to launch both employee- and customer-facing experiences, organizations will need tailored conversational strategies and specialized tools to drive effective engagement on these evolving Metaverse platforms. This session will explore the critical role of Conversational AI technologies in creating effective Metaverse solutions and experiences, and will also address the key considerations for conversational AI in applications of Metaverse technologies for improving work productivity, deploying interactive learning environments, and powering e-commerce.
[#VIRTUAL] OPEN TALK (AI): Level Up Your Data Lake - to ML and Beyond
A data lake is primarily two things: an object store and the objects being stored. Even with the most basic setup, data lakes are capable of supporting BI, Machine Learning, and operational analytics use cases. This flexibility speaks to the strength of object stores, particularly their flexibility in integrating with a diverse set of data processing engines.
As data lakes exploded in adoption, a number of improvements were made to the first architectures. The first and most obvious improvement was to file formats, which led to the development of analytics-optimized formats like parquet, and eventually modern table formats.
An even newer improvement has been the emergence of data source control tools that bring new levels of manageability across an entire lake! In this talk, we'll cover how to incorporate these technologies into your data lake, and how they simplify workflows critical to ML experimentation, deployment of datasets, and more!
[#VIRTUAL] OPEN TALK (AI): Reducing Latency and Resource Consumption for Offline Feature Generation
Personalization is one of the key pillars of Netflix as it enables each member to experience the vast collection of content tailored to their interests. Our personalization system is powered by various machine learning models. We constantly innovate by adding new features to our personalization models and running A/B tests to improve recommendations for our members. We also continue to see that providing larger training sets to our models helps make better predictions. Our ML fact store has enabled us to provide larger training sets where the training set spans over a long time window. While a great success, the ML fact store architecture has its limitations. For example, features computed while generating recommendations must be recomputed by offline feature generation pipelines. This talk is about those limitations and how we enhanced our architecture to run optimized offline feature generation pipelines.
[#VIRTUAL] OPEN TALK (AI): Bringing Life and Motion to AI Explainability
SHAP is a great tool to help developers and users understand black-box models. To push it to the next level, we will show how to leverage Dash, SHAP, GIFs, and auto-encoders to generate interactive dashboards with animations and visual representations that show how different AI models learn and change their minds as they are progressively trained with growing amounts of data.
Animations will help developers understand how frequently AI models tweak their population and local importance factors during training and how they compare across competing AI models, adding an extra layer to AI safety. Auto-encoders and LSTM will be used to generate 2-dimensional embedding representations of explainability paths at individual level, allowing developers to interactively detect algorithm decision making similarity across time and visually debug mislabeled AI predictions at each point in time.
We will show this application in the context of Chronic Kidney Disease prediction and broader Healthcare AI.