
RebelLabs’ Java Productivity Report focuses solely on performance

Natali Vlatko

This year’s Developer Productivity Report chose to highlight Java performance and testing, finding that developers are happier, more productive and produce better quality code when performance testing is run during the coding process.

RebelLabs’ annual Developer Productivity Report is out and has placed the spotlight squarely on Java performance. After gathering data from over 1,500 participants, the report defined some key characteristics of performance teams and applications, finding that the most common causes of issues are inefficient application code and too many database queries.

Sponsored by ZeroTurnaround, the survey asked respondents about application complexity, team size, application profiling and performance monitoring, as well as the tools and processes developers use for performance work.

On the tooling side, VisualVM came out on top for application profiling, scoring 46.5% of the result. JProfiler followed in second with 25.7%, with custom in-house tools rounding out the top three (20.6%). ZeroTurnaround’s XRebel claimed 3.3% of the survey result.

In asking what a high-performing software organisation looks like, report author Simon Maple, Developer Advocate at ZeroTurnaround, compared the responses of participants who claimed their users were not affected by performance issues with those whose users were. The report concludes that teams with the most satisfied end-users have the following characteristics:

  • Work in smaller teams. The best performing teams with the fewest issues have 30% fewer team members.
  • Test earlier. Happy teams with satisfied users are 36% more likely to run performance testing while they code.
  • More efficient and proactive. These teams are 38% faster at diagnosing, fixing and testing performance issues; almost 40% more likely to profile on a daily or weekly basis; and are 20% less likely to test reactively when issues arise.

JAXenter had the opportunity to interview Simon Maple about the report’s results.

JAXenter: This year’s Annual Java Productivity Report focused solely on Java performance, which you describe as “the dark art of software development”. What was the motivation behind this focus?

Simon Maple: Each year we rotate between a broad survey to understand the technical landscape of how developers work and a specific survey about a particular discipline or area of software development. At ZeroTurnaround, we’ve recently entered the performance market with XRebel, a product only a little over a year old, so our interest in performance has increased a lot over the last couple of years. We noticed it’s still an area of the application cycle that can get left behind or ignored completely, so we wanted to go deeper and understand how developers and teams currently look at performance testing and how they go about their testing.

The report shows that the team responsible for fixing performance problems is Development, with a whopping 94% stake in the data. You say it makes sense to performance test your code as you develop it – does the high percentage indicate that this isn’t happening?

The data shows us that across 94% of all respondents, it’s the developers who fix performance issues when they’re found. That doesn’t mean the issues are found by developers or even during development. This leads us to understand that when QA teams, Operations teams or performance teams find performance-related issues, anywhere in the cycle, they will most often go back to the development team to get a fix. This could be days, weeks or even months after the development of the feature has taken place.

In fact, other developers may have built on or reused the code at fault, meaning the potential fix could already be in a tangled state. Reasons such as these all make the fix more expensive, when ultimately it’s often the same person who actually makes the fix, irrespective of who finds the bug or when it is discovered.

Do the tools used by respondents for application profiling surprise you? What about XRebel’s result of 3.3%?

We were expecting JProfiler to come out on top, and in particular I was expecting Java Mission Control to have a higher percentage, not just because it’s a great tool, but also because it’s free in development and has shipped with Oracle’s Java since version 7u40. XRebel came in at 3.3%, which is extremely high for a tool that only hit the market a little over a year ago. While I’d love to think this stat is accurate, there’s always going to be bias with every survey and I expect this had an impact here. We shared the survey as widely as we could, even asking influencers in the industry to help us. Our social media reach is going to include a lot of our JRebel and XRebel customers, so we do have to factor that in when looking at the results.

Looking at how the survey takers responded about the most commonly found performance issues, you state that “we as an industry are failing to test properly”. What, in your opinion, is the proper way to tackle testing?

While I meant that statement to apply more specifically to performance testing, it does, for the most part, apply to all testing. There are people who write great unit tests and run them regularly in a CI environment, but many still do not. In the performance testing arena specifically, we need to change our mindset, much as we did with unit testing and functional testing some time ago.

This new mindset needs to treat performance testing as a first-class citizen and as something that should be run regularly, and early. When this stage of testing gets left until late, it’s often performed as a best-effort activity, or forgotten entirely, which only leads to a poor user experience for our customers. Developers often focus on their code rather than quality, which isn’t always their fault. Tough schedules and pressure from project management are a ready excuse for shipping code without thorough functional or performance testing. Applications and code can only be as good as their testing, including unit testing, functional testing and performance testing. All three need to shift left in the process and be performed alongside code development, where possible, as the sketch below illustrates.
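
As an illustration of what “shifting left” can look like in practice, here is a minimal sketch (not taken from the report) of a performance budget enforced inside an ordinary unit test, assuming JUnit 5. The CustomerSearch class and its 200 ms budget are hypothetical; the point is simply that the same suite a developer runs in CI can also catch a performance regression.

    import static org.junit.jupiter.api.Assertions.assertFalse;
    import static org.junit.jupiter.api.Assertions.assertTimeoutPreemptively;

    import java.time.Duration;
    import java.util.List;
    import org.junit.jupiter.api.Test;

    class CustomerSearchPerformanceTest {

        // Hypothetical code under test, stubbed here so the sketch compiles on its own.
        static class CustomerSearch {
            List<String> findByName(String name) {
                return List.of(name + " Smith");
            }
        }

        @Test
        void searchStaysWithinBudget() {
            CustomerSearch search = new CustomerSearch();

            // Fail the build early if the query blows its time budget, rather than
            // discovering the regression weeks later in a dedicated load-test phase.
            List<String> results = assertTimeoutPreemptively(Duration.ofMillis(200),
                    () -> search.findByName("smith"));

            assertFalse(results.isEmpty());
        }
    }

A check like this obviously doesn’t replace profiling or load testing, but it moves the first performance signal to the same place and time as functional feedback.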

You’ve asked the question of whether dedicated performance teams are better, with the results leaning towards them being superior in finding issues and bugs. Is there any other variable you would consider measuring this by? What about specialised performance knowledge?

It’s an extremely hard question to answer, really, as every team is different and individuals with different levels of knowledge make a huge impact. Also, one thing we didn’t explore in depth was the types of bugs each team found. I would speculate that dedicated performance testing teams will run more complex testing scenarios than developers might; not because they have more time, but because in a dedicated role I would expect them to have more expertise and subject matter knowledge about the performance domain.

I’d say the bugs which the performance teams find would not just be harder to find, but also harder to fix, which would explain why the amount of time taken to fix those bugs was so much higher. While real experience and knowledge are key to understanding how these teams work, it’s a very subjective and difficult metric to measure accurately.

The full report can be downloaded from RebelLabs here.

Author
Natali Vlatko
An Australian who calls Berlin home, via a two-year love affair with Singapore. Natali was an Editorial Assistant for JAXenter.com (S&S Media Group).
