
Hello Siri, Please Start My Experiment Now


By Kevin Davies and Allison Proffitt  

April 25, 2012 |  BOSTON – Transported back to the year 1986 in the movie Star Trek IV and finding himself unable to start a computer with a simple voice command, Scotty is handed a more traditional interface.   

“A keyboard? How quaint,” he grumbles.   

Roll the clock 26 years forward, and BT Global Services may have taken one small step to engineering voice commands for science projects – with a little help from its friends.  

In a live demo at the Bio-IT World Conference today, Bas Burger, BT Global Services’ president of global commerce, showed the use of Siri, the Apple iPhone’s voice-activated natural language processing technology, to launch, run and deliver the results of an experiment in the newly announced BT for Life Sciences R&D cloud.  

“Hi Bas, this experiment has been approved within your budget,” Siri said. And later, “Your experiment is completed. Would you like the results?”  

The demo was produced to showcase the newly launched BT cloud service, in conjunction with software from Accelrys and assistance from BioTeam, a Boston-based consultancy specializing in IT and compute infrastructure for the life sciences.   

“We’re using BT’s ability to use Cloud services, we’re running Accelrys software, and BioTeam pieced it all together with Siri,” says BioTeam CEO Stan Gloss.  

Voice Intercept   

The idea came out of a ‘hot house’ innovation session hosted by BT Global Services last September. The original plan was to use natural language to run the BT Cloud, but that changed once Apple launched the iPhone 4S and Siri, explains Gloss. His colleague Bill van Etten started researching the use of Siri and deploying a proxy server to do the user’s bidding.  

In the demo, Burger asks Siri to run an experiment, which launches a job on BT Compute – in this specific example, a NAMD molecular dynamics protocol running on Accelrys’ Pipeline Pilot software. The protein molecule is rendered in Accelrys’ Discovery Studio software before the results are delivered back to Burger’s iPhone.  

The way Siri normally works is that a recording of the user’s command is analyzed statistically and sent to a server at Apple. The server decodes the request, and the resulting command or question is sent back to the user’s phone.  

What van Etten did was set up a proxy server between the phone and Apple that intercepts every voice command Siri relays to the Apple servers. “The same thing happens: an analog signal is built into a statistical model, but then it goes to one of our servers. We decide if we want to do something with that command. If we do, we tell it to do something,” he told Bio-IT World.  

“I put my phone on the company’s VPN,” says van Etten. “When it hears a Siri request to go to guzzoni.apple.com, we redirect it to our proxy server. A program filters all the traffic. If it hears key phrases, it deals with them; otherwise it forwards them to Apple.”  
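The intercept-and-filter logic van Etten describes can be sketched in Python. This is an illustrative sketch only – the phrase, handler names, and dispatch structure below are assumptions, and a real Siri proxy would also have to speak Siri’s compressed binary protocol to guzzoni.apple.com, which is omitted here:

```python
# Illustrative sketch of a Siri-style phrase filter (not the actual BioTeam code).
# We assume the voice command has already been transcribed to text upstream.

HANDLERS = {}

def on_phrase(phrase):
    """Register a handler function for a recognized key phrase."""
    def register(func):
        HANDLERS[phrase.lower()] = func
        return func
    return register

@on_phrase("run my experiment")
def run_experiment(command):
    # In the demo, this is where a job would be submitted to the BT Cloud.
    return "OK, I've started your experiment for you."

def forward_to_apple(command):
    # Placeholder: the real proxy relays unmatched traffic upstream untouched.
    return None

def filter_command(command):
    """Handle recognized key phrases locally; forward everything else to Apple."""
    for phrase, handler in HANDLERS.items():
        if phrase in command.lower():
            return handler(command)
    return forward_to_apple(command)
```

The key design point is the pass-through: the proxy only acts on commands it recognizes, so ordinary Siri requests behave exactly as before.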

Van Etten wrote code to recognize a particular phrase. Once it is detected, instructions are sent to the SOAP interface of Pipeline Pilot on the BT Cloud to authorize and run a specific experiment. Van Etten scripted the various responses, such as “OK, I’ve started your experiment for you,” or “I am sorry, I could not start your experiment.”  
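A SOAP call of the kind described could look like the following. The envelope shape, operation name (`RunProtocol`), parameter names, and endpoint are illustrative assumptions, not the actual BT/Accelrys interface:

```python
# Hedged sketch: submitting a job to a Pipeline Pilot-style SOAP interface.
# Operation and field names here are assumed for illustration only.
import urllib.request

def build_run_request(protocol, user):
    """Build a SOAP 1.1 envelope asking the server to run a named protocol."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <RunProtocol>
      <protocol>{protocol}</protocol>
      <user>{user}</user>
    </RunProtocol>
  </soap:Body>
</soap:Envelope>"""

def launch(endpoint, protocol, user):
    """POST the envelope to the (assumed) SOAP endpoint and return the response."""
    req = urllib.request.Request(
        endpoint,
        data=build_run_request(protocol, user).encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "RunProtocol"},
    )
    return urllib.request.urlopen(req)
```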

Once the modeling job is complete, Siri asks whether the user would like the results, before serving them back to the iPhone.   

Mobile Computing   

Although it is just one tiny example, Gloss thinks these could be the first steps in a wave of mobile computing applications. For example, does a researcher at home or out on the road need to go back into the lab to make a decision? Does someone working with both hands (say, in a fume hood) need to see information in real time? Gloss also envisions Siri talking to BioTeam’s MiniLIMS product and other applications. “We used to think of computers as just tools for our research; now computers will be our partners in research,” he said.  

But van Etten thinks the demonstration has more profound implications. “This is a natural progression of the user interface,” he said. “We’ll soon be migrating from keyboards to voice-initiated computing. Don’t limit your view of this to just mobile computing. I believe it’s a natural language interface to computing. The phone is just what you talk to.”  

Van Etten says that Siri provides a simpler interface. “Ten years ago, the only way you could run these jobs was via a UNIX command line. We tried to put a web interface on that, but even that can be intimidating for many people… This is a simpler interface.”  

According to Jeremy Griggs, head of BT’s industry solutions/global commerce group, life sciences is a main area of focus for the company, which sees challenges such as externalization and data modeling and simulation as particularly attractive targets for cloud-based solutions.   

BT plans to offer a service store in the next few months, so users can provision what they need and add tools for global file transfer and collaboration to their library over time.   


For reprints and/or copyright permission, please contact  Jay Mulhern, (781) 972-1359, jmulhern@healthtech.com.