UPDATED 15:58 EST / DECEMBER 04 2017

BIG DATA

Accelerated processing teaches autonomous cars to drive in minutes

Speed is the name of the game in the processor world, and the latest competitive sprint down the innovation track involves field-programmable gate arrays, known as FPGAs. Since Intel Corp. announced FPGA acceleration platforms operating with Xeon CPUs early last month, several companies have been showcasing use cases for the lightning-fast technology.

At a computing conference last month, one firm created buzz among attendees with a machine learning-based demonstration where simulated cars were taught to drive in less time than it takes to eat a sandwich.

“Within a few minutes and about 15 million simulations, the cars start driving better than humans,” said John Lockwood (pictured), chief executive officer at Algo-Logic Systems Inc. “You can give [machines] man-years of experience in a few minutes with these scale-out computer systems.”
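The idea behind that quote can be sketched in a few lines: run many short simulated driving episodes in parallel and add up the simulated time they represent, which is how "man-years" of experience accumulate in minutes of wall-clock time. Everything below is a placeholder, not Algo-Logic's system; the episode length and the random "reward" stand in for a real driving simulator.

```python
from concurrent.futures import ThreadPoolExecutor
import random

EPISODE_SECONDS = 30  # hypothetical length of one simulated driving episode

def run_episode(seed):
    """Run one toy driving episode; return (reward, simulated seconds)."""
    rng = random.Random(seed)
    # Stand-in for a real physics/driving simulator: the "reward" here
    # is just a random score, not an actual driving outcome.
    reward = rng.uniform(0.0, 1.0)
    return reward, EPISODE_SECONDS

def simulated_experience(n_episodes, workers=8):
    """Run many episodes in parallel and total up the simulated time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_episode, range(n_episodes)))
    avg_reward = sum(r for r, _ in results) / n_episodes
    total_seconds = sum(s for _, s in results)
    return avg_reward, total_seconds

avg, secs = simulated_experience(1000)
print(f"avg reward {avg:.3f}, {secs / 3600:.1f} simulated hours")
```

At 15 million such episodes, the simulated experience runs into years even though the wall-clock time, on enough parallel hardware, stays in the minutes.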

Lockwood visited theCUBE, SiliconANGLE Media’s mobile livestreaming studio, and spoke with host Jeff Frick (@JeffFrick) during the recent Supercomputing event in Denver, Colorado. They discussed FPGA-based applications for the financial world and early tests of machine-to-machine communication in the drone community. (* Disclosure below.)

Algo-Logic’s network-attached Key Value Store in an FPGA scales out machine learning and enables faster decision-making, according to Lockwood. This kind of solution forms a growing ecosystem that not only powers machine learning applications for cars, but also high-frequency trading in the financial world.
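For readers unfamiliar with the term, a key-value store exposes a small set of operations — put, get, delete — keyed by an identifier. The toy Python class below only sketches that interface; Algo-Logic's version executes these lookups in FPGA logic directly off the network wire, which is where the speed comes from, and the example keys are invented for illustration.

```python
class KeyValueStore:
    """Toy in-memory key-value store with the PUT/GET/DELETE operations a
    network-attached KVS exposes. This Python dict sketches the interface
    only, not the FPGA implementation or its latency."""

    def __init__(self):
        self._table = {}

    def put(self, key, value):
        self._table[key] = value

    def get(self, key, default=None):
        return self._table.get(key, default)

    def delete(self, key):
        self._table.pop(key, None)

kvs = KeyValueStore()
kvs.put("model/weights/v3", b"\x01\x02")  # hypothetical key and value
print(kvs.get("model/weights/v3"))
```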

Millions of sub-microsecond trades

Speed has become paramount in high-frequency trading, where millions of transactions are executed in the blink of an eye. Algo-Logic has developed a trading system with sub-microsecond latency: futures market data comes in, and fixed orders are placed out to market.

“That happens in under a microsecond, under a millionth of a second,” Lockwood said. “That beats every other software system that’s being used.”
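For a sense of scale, here is a toy software "tick-to-trade" path: a tick arrives, a threshold rule decides, an order is emitted. The threshold and prices are invented, and even this stripped-down loop — no market-data parsing, no network I/O — illustrates why a general-purpose software stack struggles to match a hardware pipeline that completes the entire round trip in under a millionth of a second.

```python
import time

THRESHOLD = 100.0  # hypothetical price trigger, not a real strategy

def on_tick(price):
    """Toy tick-to-trade path: look at one futures tick, maybe emit an order."""
    if price > THRESHOLD:
        return ("SELL", price)
    return None

# Time only the software decision itself, with no parsing or networking.
start = time.perf_counter_ns()
order = on_tick(100.5)
elapsed = time.perf_counter_ns() - start
print(order, f"decision took {elapsed} ns")
```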

Rapid advances in deep learning and decision-making are raising the possibility that machine-to-machine communication will lead to some intriguing future applications. Early examples can be found in the drone community, where groups or “swarms” of unmanned aerial vehicles are being tested in the skies.

Earlier this year, the U.S. military announced that it had released 103 Perdix drones that performed a mock surveillance mission while communicating with each other. “It’s about how to make machines interact better with other machines,” Lockwood said. “Having a swarm or a cluster of these machines that work with each other, you can really do interesting things.”
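The machine-to-machine idea can be sketched minimally: each drone broadcasts its state to the others, and any single drone can then make a swarm-level decision — here, computing the group's centroid — from the messages it heard. This is an invented illustration of swarm coordination in general, not the Perdix system or any Algo-Logic product.

```python
class Drone:
    """Toy drone that shares its position with the rest of the swarm."""

    def __init__(self, ident, position):
        self.ident = ident
        self.position = position  # (x, y)
        self.peers = {}           # positions heard from other drones

    def broadcast(self, swarm):
        # Machine-to-machine message: tell every other drone where we are.
        for other in swarm:
            if other is not self:
                other.peers[self.ident] = self.position

def swarm_centroid(swarm):
    """One communication round, then the centroid as any one drone
    could compute it locally from the messages it received."""
    for d in swarm:
        d.broadcast(swarm)
    d0 = swarm[0]
    points = list(d0.peers.values()) + [d0.position]
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return (cx, cy)

swarm = [Drone("a", (0, 0)), Drone("b", (2, 0)), Drone("c", (1, 3))]
print(swarm_centroid(swarm))  # (1.0, 1.0)
```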

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the Supercomputing 2017 conference. (* Disclosure: TheCUBE is a paid media partner for the Supercomputing 2017 conference. Neither Intel, the event sponsor, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
