Meet Nexie, the hand-held data center
Machine learning is becoming ubiquitous. Organizations are producing more and more data, and they are looking for ways to derive intelligent insights from it.
Cloud providers like Amazon AWS and Google Cloud Platform offer managed services for running machine learning frameworks like MXNet and TensorFlow at scale in the cloud. Running machine learning models on-prem, however, requires enterprises to build complex infrastructure of their own.
Today Nexla is announcing a solution for companies to run machine learning models in their own data centers. Nexie is an ingenious piece of hardware, designed by Nexla, that brings machine learning into your own data center. The device can be powered on in any data center across the globe and connects with existing storage solutions via multiplexed universal ports. Nexie comes with two ports, In and Out: the In port receives data, and the Out port delivers the results of the model. Multiple Nexies can even be joined together to create portable clusters!
A portable cluster of Nexies
Nexie can read both structured and unstructured data. It has a self-service UI where you choose your source, your ML model, and the destination where results should be stored. And because Nexie is so small, it saves considerable energy compared to a traditional data center. It is very eco-friendly, as demonstrated by the above photo with succulents.
The machine learning and AI community is delighted to have a device that takes away stressful data wrangling tasks, freeing them to focus on iterating on their models at record speed. DevOps and security teams at the enterprise are very happy they don't have to move their data out of the data center.
Nexie is currently undergoing rigorous testing at Nexla and its partners in Silicon Valley, and it will be generally available later this year. If you want to be on the pre-order list, please sign up here.
Happy April Fool’s! Nexie cannot transmit data, but it is an awesome travel adapter with two USB ports. Take a meeting with Nexla and we’ll give you one.