TECH CORE

Crunchfish is a technology company that provides groundbreaking software to enable touchless interaction with any mobile device using the device's embedded camera. Its Touchless A3D SDK is an exciting step forward in consumer electronics, as Crunchfish steams ahead on its mission to simplify interaction with mobile devices. Here, Smart Chimps talks to Joakim Nydemark, CEO at Crunchfish, about Touchless A3D SDK and what new features are being demonstrated at Mobile World Congress 2014 in Barcelona, Spain.
Smart Chimps: Give us some background on Crunchfish.
Joakim Nydemark: Crunchfish was founded in 2010 with an initial focus on creating innovative applications for the iOS and Android app markets. Gesture recognition proved one of the most promising of those innovations, and Crunchfish is today dedicated entirely to touchless interaction based on gestures.
SC: What gave you the idea for this technology?
JN: We believe today's interaction with electronic devices is full of compromises, and we wanted to change that by developing a more intuitive and beautiful way of interacting. We are living in a three-dimensional world, so why should our interaction with devices be two-dimensional?
With touchless interaction you can use all three dimensions in your interactions, and bring a lot of convenience into it.
SC: How did you develop the Touchless A3D SDK with your team at Crunchfish?
JN: Our core software is developed from scratch by our strong engineering team, which has backgrounds in fields like computer vision, embedded systems and Java.
Development of our touchless technology has been ongoing since the company was founded in 2010.

SC: What challenges have you faced along the way to creating Touchless A3D SDK?
JN: One of the main challenges has been the fact that touchless interaction is still in an early phase commercially. The number of devices on the market is limited, and so, therefore, is the number of users. What we are experiencing now is that most of the device vendors out there are planning to launch touchless-enabled devices during this year, so we really feel that this is taking off now.

SC: How does Touchless A3D SDK work?
JN: We are using the camera sensor on the phone to detect and track hands, fingers, fists and heads. We then map these gestures to control functions in the device like scroll, swipe, grab, etc.

SC: What uses do you envisage this having right now?
JN: We see a number of different use cases. On the market in commercial devices we have photo swipe, answer and reject calls, and video controls. What we are showing as demos during the Mobile World Congress trade show is touchless photo capture, copy and paste, content sharing, video player controls and copy to print.
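The gesture-to-function mapping Nydemark describes, where tracked hand motions are translated into device controls like swipe or grab, can be sketched roughly as a dispatch table. The sketch below is purely illustrative: all names (`Gesture`, `dispatch`, the handler functions) are hypothetical and are not Crunchfish's actual API.

```python
# Hypothetical sketch: mapping detected gestures to device actions.
# Not Crunchfish's API; all names here are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Gesture:
    kind: str       # e.g. "swipe", "grab", "fist"
    direction: str  # e.g. "left", "right", "none"


def photo_swipe(g: Gesture) -> str:
    # A swipe gesture pages through the photo gallery.
    return f"photo_swipe_{g.direction}"


def grab_to_copy(g: Gesture) -> str:
    # A grab gesture copies the current selection.
    return "copy_selection"


# Dispatch table: gesture kind -> control function, in the spirit of
# "we map these gestures to control functions like scroll, swipe, grab".
ACTIONS: Dict[str, Callable[[Gesture], str]] = {
    "swipe": photo_swipe,
    "grab": grab_to_copy,
}


def dispatch(g: Gesture) -> str:
    # Unrecognised gestures are ignored rather than raising an error.
    handler = ACTIONS.get(g.kind)
    return handler(g) if handler else "ignored"


print(dispatch(Gesture("swipe", "left")))
```

In a real system the `Gesture` values would come from the camera-based tracking layer; the table-driven design keeps adding new gestures (photo capture, copy to print, and so on) a one-line change.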
SC: How might the use of this tech evolve?
JN: We are currently targeting mobile devices and tablets, but naturally this will grow into other devices as well, including TVs, white goods, automotive and more.
We are also looking at different specific verticals, like hospital environments and digital signage, where you are not able to touch the device and therefore need a way to interact with it from a distance.

SC: Thanks, Joakim!