
Speech Assistant Experiments

 

I love playing with AI assistants in my home. Most things in my home, like the lights, garage door, music, and thermostat, are voice controlled. My goal is to have everything in my home controlled by speech. The idea of speech interfaces in AI assistants like Jarvis from Iron Man fascinates me, and these experiments are my way of making that science fiction a reality.

Alexa Controlled Lamp

 

I created this lamp using a Particle Photon and a NeoPixel ring, and programmed it using Particle's REST-based API. For behaviors, I added different effects to the lamp, such as candle, lightning, and simple solid colors, as shown in the video.
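As a rough sketch of how Particle's REST API drives something like this: firmware on the Photon registers a cloud function, and any HTTP client can then invoke it to switch effects. The function name ("setEffect"), device ID, and token below are placeholders for illustration, not the actual names from this project.

```python
import requests

# Particle's cloud exposes firmware functions registered with Particle.function()
# at https://api.particle.io/v1/devices/{device_id}/{function_name}.
PARTICLE_API = "https://api.particle.io/v1/devices"

def build_effect_request(device_id, access_token, effect):
    """Build the URL and form payload for a hypothetical 'setEffect' cloud function."""
    url = f"{PARTICLE_API}/{device_id}/setEffect"
    payload = {"access_token": access_token, "arg": effect}
    return url, payload

url, payload = build_effect_request("my-photon-id", "my-token", "candle")
# requests.post(url, data=payload)  # uncomment on a device you own to fire the request
```

The firmware side would parse the `arg` string ("candle", "lightning", a color name) and run the matching NeoPixel animation.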

 

For the speech side, I first connected the Particle Photon to Alexa through IFTTT. This introduced latency into the triggers, so I built a sample application with the Alexa Skills Kit and wrote a new skill.
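A custom skill cuts out the IFTTT hop because Alexa calls your handler directly. Here is a minimal sketch of such a handler working on the raw Alexa request/response JSON (no SDK); the intent name "SetEffectIntent" and its "effect" slot are assumptions, not the skill's actual interaction model.

```python
# Minimal Alexa skill handler sketch: map a hypothetical "SetEffectIntent"
# to a lamp effect and answer with plain-text speech.
def lambda_handler(event, context):
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "SetEffectIntent"):
        effect = request["intent"]["slots"]["effect"]["value"]
        speech = f"Setting the lamp to {effect}."
        # Here the handler would call the Particle REST API to apply the effect.
    else:
        speech = "Try asking for a lamp effect, like candle or lightning."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Hosting the handler as an AWS Lambda function is the usual route, so the only latency left is the skill invocation itself.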

Photobot

 

Using the CMU Sphinx library for speech recognition, I made a photography robot.

 

I programmed ROS nodes in C++ that controlled the robot's movement, tracked people, and arranged them in the frame to take just the right picture when you say "Take a picture."
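The trigger logic can be sketched as a small ROS node; the original nodes were C++, so this rospy version is only an illustration, and the topic names ("/recognizer/output" for the Sphinx transcript, "/take_photo" for the camera node) are assumptions.

```python
# Sketch of the Photobot trigger node. The keyword check is plain Python;
# the ROS wiring in run_node() only runs on a machine with ROS installed.
TRIGGER_PHRASE = "take a picture"

def should_take_photo(transcript: str) -> bool:
    """Return True when the recognized speech contains the trigger phrase."""
    return TRIGGER_PHRASE in transcript.lower()

def run_node():
    import rospy
    from std_msgs.msg import String, Empty

    rospy.init_node("photobot_trigger")
    photo_pub = rospy.Publisher("/take_photo", Empty, queue_size=1)

    def on_speech(msg):
        # Tell the camera node to frame the subjects and capture.
        if should_take_photo(msg.data):
            photo_pub.publish(Empty())

    rospy.Subscriber("/recognizer/output", String, on_speech)
    rospy.spin()

# On the robot, run_node() would be called from the node's entry point.
```

Keeping the keyword match in its own function makes it easy to swap the trigger phrase or test the logic off the robot.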

Google Assistant Running on Intel Euclid

 

Google announced the Google Assistant SDK at I/O 2017, along with the fact that it runs on Linux.

The Intel Euclid runs Ubuntu, so I got the Assistant running entirely on the Euclid; the only device connected to it is a speaker.

You can ask the Google Assistant running on the Euclid anything you want. I programmed it in Python using the Assistant's Python API.
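The shape of such a program, loosely following the Google Assistant Library's hotword-style event loop, looks roughly like this. It is a sketch under assumptions: that `google.assistant.library` is installed on the device, and that the status messages below match what the Euclid would log.

```python
# Sketch of an Assistant event loop for the Euclid.
def describe_event(event_type: str) -> str:
    """Map Assistant event names to a short status line for the log."""
    messages = {
        "ON_CONVERSATION_TURN_STARTED": "Listening...",
        "ON_CONVERSATION_TURN_FINISHED": "Done.",
    }
    return messages.get(event_type, event_type)

def run_assistant(credentials):
    # Import inside the function: the library only exists on the device.
    from google.assistant.library import Assistant

    with Assistant(credentials) as assistant:
        for event in assistant.start():
            print(describe_event(str(event.type)))
```

The library handles hotword detection, audio capture, and playback itself, which is why a speaker is the only peripheral the Euclid needs.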

