Theo Vassilakis, CEO and co-founder of Metanautix, spoke to
JAXenter about Quest. Theo shared a little bit about the company,
the problems Quest can solve, and his predictions for the
world of big data.
JAXenter: Can you tell us what new features Quest is bringing
to the big data analytics table?
TV: Metanautix Quest allows analysts to easily access and
combine data from disparate silos into easy-to-understand tables –
whether the data is records, logs, documents, audio files, images,
or videos. Because it’s based on standard SQL, the solution works
within an organization’s existing toolset and doesn’t require data
to be moved to a centralized system.
Metanautix Quest simplifies and speeds up data analysis for more
effective collaboration, governance, security, privacy and
compliance by putting next-generation distributed systems
technology at the fingertips of any business analyst.
What kind of problems are you
hoping to solve with Quest?
We are in the analytics market and we see ourselves as a Data
Supply Chain company. Right now, enterprises
spend a great deal of money and time navigating
the complexity of slow, cumbersome, and opaque data pipelines.
Narrow, specialized technologies, while best suited for certain
types of data, make it very difficult to access and combine data of
different types, such as
records, logs, documents, audio, images, and video.
To get the job done, enterprises
have to employ teams of specialists, leading to delays.
And, ultimately, the resulting analytics are difficult for people
across the organization to understand. Metanautix is addressing an enterprise’s entire
data ecosystem, speeding up time to analysis, providing greater
access across the organization, and encouraging
collaboration.
Can you tell us a bit about the technology that Quest is based
on? What would we see when we look under the hood?
My co-founder and CTO Toli and I
worked at Facebook and Google – for me it was Dremel (the backend
to BigQuery and other systems) and for Toli it was the photo
processing backend for Timeline, a huge system. But those
experiences also helped us realize that many of those systems are
aimed at somewhat specialized organizations. So, we set out
to build a system that works in a more traditional enterprise
environment. For example, we spent substantial time on being
able to run on-premises rather than just in the cloud.
We can run on small machines and laptops, as well as big
servers, and thousands of machines in data centers. And it’s the
same technology inside that can analyze JSON and text alongside
Excel, NoSQL, etc. We also drew on Toli’s background in images and
computer graphics to ask the question: why is media data treated
differently? We can query raw images, video, and audio like
ordinary tables that can be joined with traditional data.
Why SQL?
Our solution marries the high-level functionality and ease of
use found with standard SQL with next generation distributed
computing, re-imagining SQL for the new world
of big data. People often have a somewhat
restricted view of SQL: traditionally, it requires importing data
into the database, then running queries. We’re re-imagining it to
say: what if you didn’t have to do a difficult import step before
you started querying? We
make it easy to point Metanautix Quest at any data and start
querying right away, no matter the size of the data or its
format.
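Quest itself is proprietary and its internals aren’t public, but the general idea – one standard-SQL query spanning semi-structured and tabular data – can be sketched with nothing but Python’s standard library, using an in-memory SQLite database as a stand-in engine. The table names and sample records below are invented purely for illustration:

```python
import json
import sqlite3

# Hypothetical sample data: a JSON event log (one "silo")...
events_json = '''
[{"user_id": 1, "action": "view"},
 {"user_id": 1, "action": "purchase"},
 {"user_id": 2, "action": "view"}]
'''

conn = sqlite3.connect(":memory:")

# ...and a traditional relational table (another "silo").
conn.execute("CREATE TABLE customers (user_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Ana"), (2, "Ben")])

# Flatten the JSON records into a table so standard SQL can reach them.
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(e["user_id"], e["action"]) for e in json.loads(events_json)])

# One standard-SQL query now spans both formats.
rows = conn.execute("""
    SELECT c.name, COUNT(*) AS purchases
    FROM customers c JOIN events e ON c.user_id = e.user_id
    WHERE e.action = 'purchase'
    GROUP BY c.name
""").fetchall()
print(rows)  # [('Ana', 1)]
```

SQLite still needs the JSON loaded into a table first; the point of a system like Quest, as described above, is to skip even that step and query the data where it lives.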
Because SQL is standard,
we can plug into standard tools like Excel, Tableau, and others
through ODBC/JDBC, so people who don’t actually write queries can use their
standard tools. The deeper advantages of SQL are that it’s a
declarative language – it helps you say what to do, not how to do
it. So it gives Metanautix an opportunity to optimize your question
over a large cluster of
machines and schedule it. It also makes it possible to analyze the
flow of information more accurately so it’s easy to trace whether
the data is being used according to policy. It’s also easier to
understand who is reading what data and what they are
using it for – and an opportunity to learn from that.
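The declarative point can be shown with a toy example – not Quest’s distributed optimizer, just stock SQLite via Python’s standard library, with invented table names. The same query text keeps working while the engine silently picks a new execution strategy:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reads (user TEXT, dataset TEXT)")
conn.executemany("INSERT INTO reads VALUES (?, ?)",
                 [("ana", "orders"), ("ana", "logs"), ("ben", "orders")])

# The query states WHAT we want, not HOW to compute it.
query = ("SELECT user, COUNT(*) FROM reads "
         "WHERE dataset = 'orders' GROUP BY user ORDER BY user")
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Adding an index gives the engine a new strategy to choose from;
# the query text and its results are unchanged.
conn.execute("CREATE INDEX idx_reads_dataset ON reads(dataset)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

results = conn.execute(query).fetchall()
print(results)      # [('ana', 1), ('ben', 1)]
print(plan_before)  # full-table scan
print(plan_after)   # typically switches to the index
```

In a distributed engine the same property lets the optimizer decide how to partition and schedule the work across a cluster, without the analyst rewriting the question.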
What have your early customers like Shutterfly been able to do
with Quest?
Our work with Shutterfly has been very rewarding for both sides.
For an analytics platform like Metanautix, there’s nothing
better than to see the systems in use in actual production
applications with large companies. Shutterfly was interested in
analyzing all the orders they receive through their e-commerce
sites to optimize their marketing spend and maximize ROI. They turned
to Metanautix Quest to help with multi-touch attribution analysis
(MTA) to identify which channels and campaigns are most effective
in driving revenue across millions of customers and
touchpoints.
While this analysis is typically complex, laborious and slow,
Metanautix Quest helped them simplify and accelerate the process –
reducing wait times from days to minutes. By writing the entire
pipeline in SQL, it became easier for more people on the team to
understand the computation and work on it. For a
consumer-oriented business with seasonal spikes in revenue, the
ability to iterate on models quickly is key to impacting millions in
sales. Metanautix’s data compute engine enabled Shutterfly to
continually refine its marketing campaigns at a speed that improves
their e-commerce business. They’ll be giving a talk with us at the
Tableau conference in Seattle, which is in full swing right now.
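Shutterfly’s actual pipeline isn’t public, but a minimal sketch of the kind of SQL an MTA pipeline might contain – here using the simple equal-credit (“linear”) attribution model, invented table names and sample data, and SQLite via Python as a stand-in engine – looks like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders  (order_id INTEGER, revenue REAL);
    CREATE TABLE touches (order_id INTEGER, channel TEXT);
""")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 60.0)])
conn.executemany("INSERT INTO touches VALUES (?, ?)",
                 [(1, "email"), (1, "search"),
                  (2, "search"), (2, "social"), (2, "email")])

# Linear attribution: split each order's revenue evenly across its
# touchpoints, then total the credited revenue per channel.
rows = conn.execute("""
    SELECT t.channel,
           SUM(o.revenue / n.touch_count) AS credited_revenue
    FROM touches t
    JOIN orders o ON o.order_id = t.order_id
    JOIN (SELECT order_id, COUNT(*) AS touch_count
          FROM touches GROUP BY order_id) n
      ON n.order_id = t.order_id
    GROUP BY t.channel
    ORDER BY t.channel
""").fetchall()
print(rows)  # [('email', 70.0), ('search', 70.0), ('social', 20.0)]
```

Because the whole computation is one declarative statement, anyone on the team who reads SQL can audit or tweak the attribution model – which is the collaboration benefit described above.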
How do you imagine the field of big data analytics changing
over the next few years?
What’s going to happen next in analytics is going to be very
exciting. Toli and I feel like we’ve glimpsed the future a
bit because companies like Google, Facebook, and Microsoft, where we
worked, operate at such a scale that they often have to build the
future a bit to keep going. There’s a great William Gibson
quote that says something like “The future is here, it’s just not
widely deployed yet”. But how things actually go is always
surprising. For instance, one of the things that I think is
already happening but will become much more common is for many more
people in an organization to work with data.
There’s just so much more data coming in through mobile,
sensors, devices, web systems, etc. It’s no longer going to
be possible to wait for data to arrive at a warehouse, or
reservoir; people will need to just go get it where it is. I
think the other big change will be that people become even
more educated about the technology and what it enables for them.
At the moment, people think of cloud, and storage, and
analytics, and big data as being somewhat tied together, partly
because that’s how they started.
But lots of technology is going
to get disaggregated so that people can use just the part they need
without being locked into the parts they don’t want. In some
sense that’s what Metanautix is doing as well. We’re
disaggregating or unbundling the execution and computation part of
the database so people can use it wherever they want. So,
there will be a lot more flexibility so that enterprises and all
organizations can adapt more quickly to an evolving
environment.