A “Dialogue” on the recent advances in Conversational Artificial Intelligence (AI)

How important is it to interact, converse and emote in a world that is becoming increasingly closed and parochial? Conversational Artificial Intelligence (AI) offers a way to build agents that can learn and respond like humans, and thereby helps bring the long-term goal of General AI to fruition.

Conversation with artificial assistants, be it Microsoft’s Cortana, Apple’s Siri, Google Now or Amazon’s Alexa, has gained prominence over the last few years. So sit back, relax and enjoy the simple conversational interface on offer, as I take you through a short tour!

In this two-part blog series, I cover the latest developments in the field of dialogue and conversational Artificial Intelligence (AI). I give a brief overview of current developments in the field and the many Language Understanding tools on the market, and in particular review one of them – IBM Conversation.

It’s a rat race – so act and don’t overthink!

After the horrors of Tay – Microsoft’s conversational AI Twitter bot that was rolled back earlier this year due to its racist and sexist tweets – AI enthusiasts have had some good news over the last few months.


Microsoft hurried the launch of Tay, its conversational AI bot, only to withdraw it completely.

The Amazon Echo, Google Home and the smart home hub Apple has been preparing are good examples of how big companies are fighting tooth and nail to secure a place in your smart space. Here’s what François Chollet, researcher at Google and author of the popular framework Keras, has to say:

Whatever idea you started working on last week, a few other teams have probably been working on it for a month and are about to publish.
— François Chollet (@fchollet) October 5, 2016

Alexa Prize Competition

Just four weeks back, Amazon announced the Alexa Prize, an annual competition for university students dedicated to accelerating the field of conversational AI. This inaugural competition focuses on creating a social bot, using the Alexa Skills Kit (ASK), that can converse coherently and engagingly with humans on popular topics and news events. It gives student developer teams the opportunity to explore a plethora of advanced topics in AI, including knowledge acquisition, natural language understanding, natural language generation, context modeling, commonsense reasoning and dialogue planning. With a huge cash prize at stake, goodies on offer and support from the ASK team, building a socially coherent bot would be a worthwhile experience!  The deadline for team submissions is October 28, 2016, and more details about the application process can be found here.

Say Allo!

Google Allo is a smart messaging app with personalized recommendations from the Google Assistant that lets you express yourself better with stickers, doodles, and HUGE emojis and text. Allo also allows you to get help from your Google Assistant without leaving the conversation. You can start a one-to-one conversation with your Assistant by addressing it with the @google tag, and it gets better the more you use it. More functional details are on the blog post Say hello to Google Allo: a smarter messaging app.

IBM Pepper Developer Conference

IBM BusinessConnect 2016, held on 4 October 2016 in Stockholm, Sweden, showcased some of the IBM Watson-powered tools and applications running in the humanoid robot Pepper.

Yesterdays #IBMBCSE at Stockholm Waterfront was fantastic thanks to all IBMers, partners and customers, and thanks to #Pepper of course! pic.twitter.com/quZuaptu8Z
— IBM ClientCtr Nordic (@IBMCCNordic) October 5, 2016

Pepper is a SoftBank robot that uses IBM Watson technology at its core.

Banzai! (Live long!) – Watch this first home robot commercial; an unforeseen future is coming!

The Watson Developer Conference, to be held in San Francisco on 9–10 November this year, is packed with technical talks, hands-on labs, and coding challenges to get you working with the tools that will make you a sought-after developer.


The IBM Global Industry Solution is located in Nice, France.

Joie de vivre – Samsung buys Viv

And after Google’s Allo and IBM’s Pepper, it was Samsung’s turn to jump on the dialogue-based conversational AI bandwagon as it acquired Viv, built by the creators of Apple’s Siri. Viv is a more powerful take on Siri that aims for ubiquity. With self-generating software capable of writing its own code to accomplish new tasks through dynamic program generation, Viv handles new user requests and builds plans on the fly!

In its demo video “Beyond Siri: The World Premiere of Viv with Dag Kittlaus” (embedded below) earlier this year, it was already apparent that Viv would eventually be partnered with or sold to a mobile device maker.

With everyone wanting to invest heavily, the question was who and when! Hence, this announcement from Samsung doesn’t come as a big surprise.

Viv will ultimately provide services to Samsung and its platforms but remain an independent entity. Samsung hopes to disrupt the mobile market with this acquisition, and it can extend Viv to other home devices as well – after all, it purchased SmartThings for around $200M back in 2014. More details on the acquisition here: Samsung acquires Viv, a next-gen AI assistant built by the creators of Apple’s Siri

Don’t take it slow, because there is Ozlo!

Ozlo, launched a few days back on iOS and the web, is another of the many sprouting AI assistants; it keeps a good memory of your previous interactions. Ozlo, at least by its name, attempts to be different from the competing assistants currently on the market, which tend to use repetitive female names. Best of all, it is integrated with a plethora of services like Yelp, TripAdvisor and IMDB, among many others, and uses Further Food, Authority Nutrition, Cookies and more to provide nutritional guidance. This is a huge advantage over its rivals, which tend to prioritize their own services rather than integrating with existing ones. An in-depth review can be found here: Ozlo AI assistant is the new underdog filling the void left by Viv

And then there were rumors that Apple was going to buy McLaren, which set eyeballs rolling: a big tech giant entering the completely new domain of the automobile industry could lead others like Google, Microsoft and IBM to follow suit and invest heavily!

Conference workshops also want a dialogue!

There are 50 workshops in total at NIPS 2016 this year, covering a range of different Machine Learning topics.

  1. The Dialogue workshop, scheduled for the 10th of December, focuses on building agents capable of coordinating with humans via communication. Beyond its tremendous economic potential, the ability to converse is intimately tied to the overall goal of AI.
    The call-for-papers deadline has been extended to midnight on October 23, 2016, and more details about the workshop schedule can be found at the chair website LET’S DISCUSS: LEARNING METHODS FOR DIALOGUE NIPS 2016 WORKSHOP. The papers cover the three high-level areas below:

    • Being data-driven, especially offline/online evaluation
    • Building complete applications or end-to-end systems
    • Model innovation that incorporates linguistic knowledge into the architecture
  2. Another workshop, on Interactive Machine Learning (IML), is to be held on the 9th of December. It focuses on how autonomous agents can solve a task by adaptively collaborating and interacting with humans. Designing and engineering fully autonomous agents is difficult, and there is a compelling need for IML algorithms that enable artificial and human agents to collaborate on independent or shared goals.
    The call for papers invites new ideas in interactive learning, reports on research in progress, and discussions of open problems and challenges facing interactive machine learning, with particular interest in the practical application of interactive learning systems (for robotics, virtual agents, dialogue systems, among others) and the ability of these systems to handle the complexity of real-world problems. More details about the application process, requirements, deadline, etc. can be found at the workshop portal Future of Interactive Learning Machines Workshop (FILM at NIPS 2016).

In the next part of this series on Conversational AI, I will cover the basics behind the Language Understanding tools on the market that enable you to build a dialogue system.

Read the second Part here: A review of Language Understanding tools – IBM Conversation

Data Science on a large scale – can it be done?

Analytics drives business

In today’s digital world, data has become the crucial success factor for businesses as they seek to maintain a competitive advantage, and there are numerous examples of how companies have found smart ways of monetizing data and deriving value accordingly.

On the one hand, many companies use data analytics to streamline production lines, optimize marketing channels, minimize logistics costs and improve customer retention rates.  These use cases are often described under the umbrella term of operational BI, where decisions are based on data to improve a company’s internal operations, whether that be a company in the manufacturing industry or an e-commerce platform.

On the other hand, over the last few years, a whole range of new service-oriented companies have popped up whose revenue models wholly depend on data analytics.  These Data-Driven Businesses have contributed largely to the ongoing development of new technologies that make it possible to process and analyze large amounts of data to find the right insights.  The better these technologies are leveraged, the better their value-add and the better for their business success.  Indeed, without data and data analytics, they don’t have a business.

Data Science – hype or has it always been around?

In my opinion, there is too much buzz around the new era of data scientists.  Ten years ago, people simply called it data mining, describing similar skills and methods.  What has actually changed is not the statistical methodology but the fact that businesses are now confronted with new types of data sources such as mobile devices and data-driven applications.  I described that idea in detail in my recent post Let’s replace the Vs of Big Data with a single D.

But, of course, you cannot deny that the importance of these data crunchers has increased significantly. The art of mining data mountains (or perhaps I should say “diving through data lakes”) to find appropriate insights and models and then find the right answers to urgent, business-critical questions has become very popular these days.

The challenge: Data Science with large volumes?

Michael Stonebraker, winner of the Turing Award 2014, has been quoted as saying: “The change will come when business analysts who work with SQL on large amounts of data give way to data scientists, which will involve more sophisticated analysis, predictive modeling, regressions and Bayesian classification. That stuff at scale doesn’t work well on anyone’s engine right now. If you want to do complex analytics on big data, you have a big problem right now.”

And if you look at the limitations of existing statistical environments out there using R, Python, Java, Julia and other languages, I think he is absolutely right.  Once the data scientists have to handle larger volumes, the tools are just not powerful and scalable enough.  This results in data sampling or aggregation to make statistical algorithms applicable at all.
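To make that limitation concrete, here is a minimal sketch (the file name and column names are purely hypothetical) of the typical workaround: sample the data down until a single-machine tool can cope, at the cost of throwing information away.

```python
# Hypothetical example of the sampling workaround described above:
# only a slice of the full data set ever reaches the model.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Read only a manageable sample of a much larger file (name is made up).
sample = pd.read_csv("transactions.csv", nrows=1000000)

X = sample[["amount", "num_items", "hour_of_day"]]  # hypothetical feature columns
y = sample["is_fraud"]                               # hypothetical label column

model = LogisticRegression()
model.fit(X, y)  # the model never sees the full data volume
```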

A new architecture for “Big Data Science”

We at EXASOL have worked hard to develop a smart solution to respond to this challenge.  Imagine being able to use raw data and intelligent statistical models on very large data sets directly where the data is stored: the data is processed in-memory for optimal performance, distributed across a powerful MPP cluster of servers, in an environment where you can “install” the programming language of your choice.

Sounds far-fetched?  If you are not convinced, then I highly recommend you have a look at our brand-new in-database analytic programming platform, which is deeply integrated into our parallel in-memory engine and extensible with nearly any programming language and statistical library.

For further information on our approach to big data science, go ahead and download a copy of our technical whitepaper:  Big Data Science – The future of analytics.

A quick primer on TensorFlow – Google’s machine learning workhorse

Introducing Google Brain‘s TensorFlow™

This week started with major news for the machine learning and data science community: the Google Brain Team announced the open sourcing of TensorFlow, their numerical library for tensor network computations. This software is actively developed (and used!) within Google and underpins many of Google’s large-scale neural network applications, such as automatic image labeling and captioning as well as the speech recognition in Google’s apps.

TensorFlow in bullet points

Here are the main features:

  • Supports deep neural networks – and many more machine learning approaches
  • Highly scalable across many machines and huge data sets
  • Runs on desktops, servers, in the cloud and even on mobile devices
  • Computation can run on CPUs, GPUs or both
  • All this flexibility is covered by a single API, making the execution very streamlined
  • Available interfaces: C++ and Python. More will follow (Java, R, Lua, Go…)
  • Comes with many tools helping to build and visualize the data flow networks
  • Includes a powerful gradient-based optimizer with auto-differentiation
  • Extensible with C++
  • Usable for commercial applications – released under the Apache License 2.0
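To make these bullet points concrete, here is a minimal sketch of the graph-and-session workflow using the Python interface (the values are purely illustrative, and the snippet assumes the original 1.x-style API):

```python
# Minimal sketch of building and running a TensorFlow data flow graph
# (original graph-and-session, i.e. 1.x-style, Python API).
import tensorflow as tf

# Build the graph: two constant tensors and one addition node.
a = tf.constant([1.0, 2.0, 3.0], name="a")
b = tf.constant([4.0, 5.0, 6.0], name="b")
total = tf.add(a, b, name="total")

# Nothing is computed yet; execution happens inside a session,
# which can be backed by CPUs, GPUs or a whole cluster.
with tf.Session() as sess:
    print(sess.run(total))  # -> [5. 7. 9.]
```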

Tensor, what? Tensor, why?

“Numerical library for tensor network computations” maybe doesn’t sound too exciting, but let’s consider the implications.

Application of tensors and their networks is a relatively new (but fast evolving) approach in machine learning. Tensors, if you recall your algebra classes, are simply n-dimensional data arrays (so a scalar is a 0th-order tensor, a vector is 1st order, and a matrix a 2nd-order tensor).

A simple practical example is a color image’s RGB layers (essentially three 2D matrices combined into a 3rd-order tensor). Or a more business-minded example: if your data source generates a table (a 2D array) every hour, you can look at the full data set as a 3rd-order tensor, with time being the extra dimension.
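If it helps to see this in code, here is a tiny illustrative sketch of tensor orders expressed as NumPy array shapes (all names and numbers are made up for illustration):

```python
# Illustrative only: tensor orders as n-dimensional array shapes.
import numpy as np

scalar = np.float32(3.14)            # 0th-order tensor, shape ()
vector = np.zeros(5)                 # 1st-order tensor, shape (5,)
matrix = np.zeros((4, 6))            # 2nd-order tensor, shape (4, 6)
rgb_image = np.zeros((480, 640, 3))  # 3rd-order tensor: height x width x RGB channel

# Hourly tables stacked along a time axis also form a 3rd-order tensor.
hourly_tables = [np.random.rand(100, 10) for _ in range(24)]  # 24 tables, 100 rows x 10 columns each
day_tensor = np.stack(hourly_tables)
print(day_tensor.shape)  # -> (24, 100, 10)
```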

Tensor networks then represent “data flow graphs”, where the edges are your multi-dimensional data sets and nodes are the mathematical operations on this data.

Example of a data flow graph with multiple nodes (data operations). Notice how the execution of nodes is asynchronous; this allows incredible scalability across many machines. Image Source.

Looking at your data through the tensor formalism gives you a lot of powerful tools that were already developed for tensor algebra, allowing fast, complex computations.  

Tensor networks are also a natural fit for computations done on graphical processing units (GPUs), as these are built exactly for the purpose of very fast numerical operations on such data – speeding up your calculations significantly compared to standard CPU execution!

The importance of flexible architecture & scaling

The data flow graph approach has further advantages. Most notably, you can split the design of your data flows (i.e. data cleaning, processing, transformations, model building etc.) from their execution. You first build up the graph of your data flow and then send it for execution: either on the CPUs of your machines (which can be your laptop just as well as a cluster), on GPUs, or on a combination. This happens through a single interface that hides all the complexities from you.
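As a rough sketch of what this separation looks like in the Python interface (1.x-style API; the device strings and numbers are illustrative, and "/gpu:0" only works if a GPU is actually present):

```python
# Sketch: design the data flow first, then send it for execution.
import tensorflow as tf

# Step 1: build the graph, pinning these operations to a device.
with tf.device("/cpu:0"):  # could be "/gpu:0" on a machine with a GPU
    x = tf.placeholder(tf.float32, shape=(None, 3), name="x")
    w = tf.constant([[1.0], [2.0], [3.0]], name="w")
    y = tf.matmul(x, w, name="y")

# Step 2: execute the graph through the single Session interface,
# feeding in whatever data is available at run time.
with tf.Session() as sess:
    print(sess.run(y, feed_dict={x: [[1.0, 0.0, 2.0]]}))  # -> [[7.]]
```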

Since the execution is asynchronous it scales across many machines and can deal with huge amounts of data.

You can count on the Google guys to build tools not only for academic use, but also for heavy-duty operations in industry!

Is this just another deep learning library?

TensorFlow is of course not the first library to embrace the tensor formalism and GPU execution. The nearest comparisons (and competitors) are Theano, Torch and CGT (Caffe to a limited degree).

While there are significant overlaps between the libraries, TensorFlow tries to provide a broader framework. It is not only a deep learning library – the data flow graphs can incorporate any data processing/analysis application. It also comes with a very powerful gradient-based optimizer with automatic calculation of derivatives, offering huge flexibility.
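As a small illustration of that optimizer and the automatic differentiation behind it (again assuming the 1.x-style API; the toy data is made up), here is a linear fit where the gradients of the loss are derived automatically:

```python
# Sketch: gradient-based optimization with automatic differentiation.
import tensorflow as tf

x = tf.constant([[1.0], [2.0], [3.0], [4.0]])
y_true = tf.constant([[2.0], [4.0], [6.0], [8.0]])  # y = 2 * x

w = tf.Variable([[0.0]])
b = tf.Variable([0.0])
y_pred = tf.matmul(x, w) + b

loss = tf.reduce_mean(tf.square(y_pred - y_true))
# Gradients of the loss w.r.t. w and b are computed automatically.
train_step = tf.train.GradientDescentOptimizer(learning_rate=0.05).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(500):
        sess.run(train_step)
    print(sess.run([w, b]))  # should approach w ≈ 2, b ≈ 0
```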

Given this broad vision the closest competitor is probably Theano (while Caffe and the existing Theano wrappers have a narrower focus on deep learning). TensorFlow’s distinguishing feature is that by design its focus is on large, scalable architectures with a complete flexibility in the hardware, best suited for industry/operational use, whereas the other libraries have more academic pedigrees.

Initial analyses also indicate that TensorFlow should bring performance improvements compared to Theano, although no comprehensive benchmarks have yet been published.

As the other packages have already been out for a while, they have large, active communities and often additional supporting software (examples are the very useful wrappers around Theano like Lasagne, Keras and Blocks that provide higher-level abstractions to its engine).

Of course, with Google’s gravitas, one can expect that TensorFlow’s open source community will grow very fast and the contributors will quickly add a lot of additional features (and find hidden bugs).

Finally, keep in mind that while Google has provided us with this great data processing framework and some of its machine learning capabilities, it is likely that the most powerful machine learning algorithms still remain Google’s proprietary secret.

Nonetheless, TensorFlow is a huge and very welcome contribution to the open source machine learning world!

Where to go next?

You can find Google’s getting started guide here. The TensorFlow white paper is worth a read too. Source code can be found at the Github page. There is also a Vagrant virtual machine with TensorFlow pre-installed available here.