Google’s single-board computer enhances on-device machine learning
Just how tiny can machine learning get? The latest hardware from Google brings machine learning to mobile and IoT devices. Coral is a platform for developing with local, on-device AI, and it includes a complete single-board dev board as well as a USB accelerator accessory.
The TensorFlow Dev Summit 2019 continued to roll out the goodies, with software updates and new hardware announcements. When it comes to AI and machine learning, Google is no stranger to innovation. Now, there's a newcomer to the mix.
Meet Coral: a platform for developing with local AI. Currently in beta, Coral consists of a development board and a USB accelerator stick. Its low power requirements suit embedded applications, and it can be deployed offline or in areas with limited Internet connectivity.
Let's see what powerful machine learning these pieces of hardware can do.
Coral Dev Board
The Coral dev board is a single-board computer with the Edge TPU integrated on a removable system-on-module (SOM). It is a complete system for prototyping IoT devices and embedded systems, with a small footprint and high-speed ML inferencing.
You can run any Linux tool on the dev board, as it uses a derivative of Debian Linux.
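Getting started looks roughly like this. The sketch below follows Coral's documented setup flow, assuming you have Google's `mdt` (Mendel Development Tool) installed on your host machine and the board connected over USB; exact commands may vary with your image version.

```shell
# On the host: list Coral boards reachable from this machine
# (mdt is Google's command-line tool for Mendel-based boards)
mdt devices

# Open a shell on the dev board over USB
mdt shell

# On the board: run the preinstalled Edge TPU demo,
# which streams an object-detection video feed to your browser
edgetpu_demo --stream
```

Because the shell you land in is ordinary Debian-derived Linux, the usual `apt`, `python3`, and friends are all available from there.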
Check out the specs of the dev board. It currently sells for 149.00 USD and is available for purchase on Mouser.com.
Go more in-depth with the datasheet.
While the dev board is a complete system, the USB accelerator is an accessory device. It works with Raspberry Pi and other Linux systems and sells for 74.99 USD on Mouser.com. It uses a USB 3.0 Type-C socket.
Here are the specs:
Can I also just say: the USB accelerator is adorably small, at just 65 mm x 30 mm.
The hardware itself is well done. It packs impressive power into those tiny dimensions. (How small is the Edge TPU chip? Look at it compared to a US penny. Don't drop it; you'll never find it again. Let's take a moment and marvel at how far hardware sizes have come in just a matter of decades!)
While looking at the device, one negative stands out among all the positives: you must use Google's cloud-based model compiler to prepare models for the Edge TPU. There is no compiler that runs locally, which may turn off some potential customers. Does the affordable price make up for this, and does it affect your interest?
Let’s hope that Google continues to invest time into this new creation of theirs. A few times in the past we have been burned by Google discontinuing a great service (who else is still salty about Inbox?).
Will Coral be the basis for your next machine learning creation? Is it already in your cart or on its way to your doorstep?