Keep calm and jClarity

jClarity launches performance tuning tool for Java in enterprise and cloud

Since the company’s inception last December, jClarity have been marshalling their Java expertise to create a new analytics product. They’ve now emerged from their East London base with the new jClarity performance tool, designed to present diagnoses and solutions to a range of Java and JVM performance issues in plain English, and bring harmony to your development team. We spoke to CEO Ben Evans for the story on their new release.

JAXenter: Can you tell us what challenges you set out to solve with the jClarity performance tool?

Evans: In the existing marketplace for performance software, one of the problems is that the available tools are very good at generating a lot of data, but they are not good at providing the context that allows you to see what the data means. What we want to do is move the problem on from data collection to data analysis and insight.

So a step up from what's on the market already?

Well, they are very useful tools, and they certainly have their place, but they can only go so far. If you've ever been in one of those situations where you're trying to use them to debug a serious problem, you reach the point of, 'And now what?' You've got the data, but quite often the operations team aren't data analysis experts; they don't really know what it means, or how to spot patterns and correlations in it.

It's about trying to find a common ground where everyone involved in the performance analysis process can meet. You have operations staff, you have developers, and you have management, and often (I saw this all the time in banks) what would happen is someone would write a script to collect some data which they felt was important to the problem, pass it around in an email, maybe with a couple of screenshots, and then everyone would just argue about what it meant.

Whereas, with jClarity, you have the agent, which diagnoses the problem for you. It can pinpoint about eight or nine large classes of problem that applications have, and it stores a record of the diagnosis, so people can go back and refer to it. It provides you with a lot more context and a lot more high-level pointers as to what's actually going on.

The other interesting thing about it is that, traditionally, people have always focused on performance monitoring in production, but if you have analysis tools, you can do something more than just monitor production. You can check code as it's going in, for example in a QA, pre-production, or performance testing environment. You can analyse software and products before they get anywhere near live customers. So, as well as the early warnings, the 'Oh god, we've got a problem, we need to fix it now!' side, we can also do massive risk reduction.

Would you say that this significantly cuts down the performance testing process?

Often in a large enterprise you have an external testing team, and in many cases they are probably not well thought of by the developers; they get a bit of a hard time, and you see the same pattern repeatedly: the QA team say, 'I'm not sure about this release', the development team say, 'Oh, we wrote it, so it's fine', and the QA guys end up getting overruled. The release goes in, and then it breaks stuff. But, with our program in place, you don't have that problem, because the QA team now have something concrete to print out. Development teams have to justify themselves to the QA team, and it's a lot more quantitative. It provides that level of communication which is lacking, I think, in other tools.

Can you tell us more about the kind of Java and JVM issues which you think jClarity’s new product will particularly help developers with?

We can do anything from threading to garbage collection, to application code, to external problems, such as something coming up on a database connection, so we really do think that we've covered all of the major bases where performance problems can hide.

Developers often have, as we all do, certain cognitive biases. If you talk to a developer about what makes an application slow, they will always think of code, because that's what they know about and that's what they're trained to think about.

Quite often, in my experience of consulting with and listening to customers that have these kinds of performance problems, developers' application code and algorithms are the problem only about ten percent of the time. Nine times out of ten, it is something other than the way the code has been written. So that's one of the other cool things about jClarity: it removes that human bias. It's the machine itself that sees where the problem is and will tell you where to find it.

You describe yourselves as 'pioneering'. Is there anyone else at the moment doing something similar to what you do, or do you feel like you're totally unique on the market?

I would say that we are highly complementary to the tools that are already out there, because no-one does exactly what we do. At the same time, we're only a small startup with limited resources, so we haven't tried to go out and solve all the problems that already have solutions. For example, JAXenter readers may have New Relic or AppDynamics, and you know, they're good tools. But they don't do what we do.

I think people in a similar space to us can't do the kind of analysis we can, because we're focused on taking the essence of the problem, boiling it down, and producing a very smart system which learns. And that's the other thing about our system: once it's calibrated and on the machine, it just continues to get smarter.

You told us about your future objectives when we last spoke in January, which were to get your tools into the hands of the developers and ops folk that need them, reduce the stress levels of fellow technologists, build your piece of the "real Cloud", and keep up the intensity in building tools that raise the standard of the industry. How far have you come in achieving these goals?

Well, with the progress that we've made with jClarity, I think we're just about to see a payoff. We've discovered with some of our earlier customers that the way jClarity can really be used is as part of a production build team, and also in QA.

It's also become very clear to us that what we're building is cloud native, but can also be applied in the enterprise. In fact, one of the places where we are really expecting to see a lot of value for customers is with people who are about to make their first cloud deployment.

If you look at where large enterprise spend is happening, everyone's talking about the cloud, but not that many enterprises are fully cloud native, and some are still struggling with their initial deployments. The performance characteristics of a cloud application are totally different to one hosted on premises, because you have virtualisation and you're sharing resources; it just doesn't look that similar. And if you're a bank or a financial institution, you're used to just buying a big load of servers, building out a data centre, and not having to think about how your resources will be shared.

In many cases that means you're carrying a whole lot of risk, and if you start to move those applications across onto cloud servers, there's a real possibility that something's going to break. If you have jClarity running, you put it on the QA run for your cloud deployment, and you can try it out. We can find a lot of those problems for you before you go live on the cloud.

We also maintain a number of open source projects on our GitHub account, in particular testing tools. Some of the things we've open-sourced are projects that run deliberately badly, because one of the problems we've found is that we need a nice, clear signal in order to prove that our application and our tooling work properly.
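To make that concrete, here is a minimal sketch of the sort of deliberately misbehaving test program Evans describes: a Java class that does nothing useful except generate unmistakable garbage collection pressure. This is an illustrative sketch, not code from jClarity's actual repositories; the class name and allocation sizes are invented for the example.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical "deliberately bad" workload: it churns the young generation
    // with short-lived allocations and slowly leaks references into the old
    // generation, producing a clean GC signal for an analysis tool to detect.
    public class AllocationChurn {

        private static final List<byte[]> retained = new ArrayList<>();

        public static void main(String[] args) throws InterruptedException {
            while (true) {
                // Allocate 1 MB of mostly short-lived garbage per iteration.
                byte[] chunk = new byte[1024 * 1024];

                // Retain roughly 1 in 100 allocations to build old-gen pressure.
                if (Math.random() < 0.01) {
                    retained.add(chunk);
                }
                Thread.sleep(1);
            }
        }
    }

Run with a small heap and GC logging enabled (for example, java -Xmx256m -verbose:gc AllocationChurn) and the output shows increasingly frequent collections followed by an eventual OutOfMemoryError: exactly the kind of clear, reproducible signal against which a memory analysis tool can be validated.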

Are there any other projects in the pipeline now that you've launched your flagship product?

Very good question. We are looking very closely at customers, seeing what they are developing and what they're going to need, and I would think that post-JavaOne, after all the conversations we'll be having with people there, we should have a much clearer picture of what our roadmap is going to look like for the next year.

It's not necessarily always about the core analysis capabilities. Of course we'll continue to refine those, and of course we'll continue to add new features and deeper dives into the areas that customers say they are interested in and want more detail on. But the thing we won't compromise on is complete focus on customers. One of the things we realised when we started looking at the space was that a lot of the products on the market didn't always have a great customer interface. We want our product to be easy to use and straightforward, even for people who aren't professionals.

Lucy Carey
