New and improved

Welcome Julia v1.0! Countless improvements for the scientific computing language

Sarah Schlothauer

Julia v1.0 is finally here. What does this language bring to the table, and what can you expect to find in its first major release? Let’s take a look at this labor of love and see what the high-speed technical language is all about.

Julia first entered the TIOBE top 50 programming languages in September 2016, and since then it has seen both growth and dips. After hitting its all-time high of number 37 in March 2018, its current rank for August 2018 just edges into the chart at number 50.

This underdog of programming languages has some tricks up its sleeve, and we expect to see wider adoption with the release of v1.0. Let’s take a look at what Julia is capable of and what the highly anticipated new release includes.

I wish, I wish…

When Julia was created, it took inspiration from the best parts of other languages, planning to fuse them together into the perfect programming chimera. Its creators listed their wishes back in 2012: the speed of C, the dynamism of Ruby, the true macros of Lisp, mathematical notation like Matlab’s, usability for general programming like Python’s, and ease for statistics like R’s.

The list is already extensive, but it doesn’t stop there. “…we want something that provides the distributed power of Hadoop – without the kilobytes of boilerplate Java and XML…We want to write simple scalar loops that compile down to tight machine code…”

This is a tall order for a language and readers may half expect them to also start wishing for a pet unicorn and a castle. However, Julia isn’t all fantasy.
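
That last wish, at least, is easy to demonstrate. Here is a minimal sketch (the axpy! function and its inputs are our own illustration, not from the article): a plain scalar loop that Julia’s JIT compiles to machine code you can inspect with the built-in @code_native macro.

```julia
# A plain scalar loop, written element by element.
function axpy!(y, a, x)
    @inbounds for i in eachindex(x, y)
        y[i] += a * x[i]
    end
    return y
end

y = zeros(1_000); x = ones(1_000)
axpy!(y, 2.0, x)

# Inspect the tight machine code the JIT produced for this call:
# @code_native axpy!(y, 2.0, x)
```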

SEE ALSO: Can Julia give us everything?

Language features

In 2016, we explored the popularity of Julia with the Federal Reserve Bank of New York. FRBNY switched from MATLAB to Julia and has been singing its praises since. They gave the reason for the switch as follows: “Julia boasts performance as fast as that of languages like C or Fortran, and is still simple to learn…We tested our code and found that the model estimation is about ten times faster with Julia than before, a very large improvement.”

Julia’s benchmarks are available to explore, and you can see for yourself how fast the language is when compared to languages such as JavaScript and Python.

Besides speed, what else does Julia bring to the table? For one, its high-level technical computing power is easy to pick up because its syntax is familiar to users of other technical computing languages. It also plays well with other languages, providing stable interfaces to big names such as C, Python, R, and Java, and it has powerful shell-like capabilities.
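
As a rough illustration of both points, here is a small sketch (the function and values are invented for illustration, and the ccall assumes a Unix libc): the definition and dot call will look familiar to MATLAB or NumPy users, while ccall invokes a C function directly with no wrapper code.

```julia
# Math-flavored definitions and element-wise "dot" calls will look
# familiar to MATLAB and NumPy users.
f(x) = 3x^2 + 2x - 1
ys = f.([0.0, 0.5, 1.0])     # applies f element-wise, no explicit loop

# Built-in C interop: call libc's clock() directly, no wrapper needed.
t = ccall(:clock, Int32, ())
println(ys, "  clock ticks: ", t)
```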

Uses for Julia

  • Data science: interact with your data with databases including MySQL, JDBC, ODBC, HDFS, and Hive
  • Machine learning: build models with automatic differentiation, GPU acceleration, and scalable machine learning
  • Scientific domains: write complex scientific tools in Julia with its rich ecosystem
  • Parallel computing: designed for parallelism and heterogeneous computing (see the threading sketch after this list)
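
On the parallel computing point, here is a minimal sketch of Julia’s built-in multi-threading (the threaded_sum function is our own toy example; launch Julia with the JULIA_NUM_THREADS environment variable set to use more than one thread):

```julia
using Base.Threads   # nthreads, threadid, @threads

# Sum an array with one partial accumulator per thread.
function threaded_sum(xs)
    partials = zeros(eltype(xs), nthreads())
    @threads for i in eachindex(xs)
        partials[threadid()] += xs[i]
    end
    return sum(partials)
end

println(threaded_sum(collect(1.0:1_000_000.0)))   # 5.000005e11
```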

The language’s community is fairly small but passionate. The r/Julia subreddit has 4.4k subscribers and links to tutorials, projects created in Julia, and opinion pieces about the language’s strengths and weaknesses.

What’s new in v1.0?

Version 1.0 was recently released and celebrated at JuliaCon2018. Some of the new features that have been added since 0.6 are:

  • A brand new built-in package manager, with better performance and support for per-project reproducible environments
  • A canonical representation for missing values, with efficient support in arrays and standard library functions
  • A built-in String type that can safely hold arbitrary, even invalid, data
  • Broadcasting via the dot syntax as a first-class language feature that custom types can extend
  • Named tuples, for representing and accessing data by name
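
For a quick, hedged taste of two of these features (the values here are made up for illustration):

```julia
# Named tuples: lightweight records with named fields.
pt = (x = 1.5, y = -2.0, label = "sensor A")
println(pt.x + pt.y)                   # -0.5

# Extensible dot broadcasting: any function applies element-wise.
clamp01(v) = clamp(v, 0.0, 1.0)
println(clamp01.([-0.3, 0.4, 1.7]))    # [0.0, 0.4, 1.0]
```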

SEE ALSO: How to develop machine learning responsibly

See the release notes for a full list of everything.

That’s not all. Version 1.0 has some new external packages being built around its new capabilities. From the release announcement blog, these include:

  • The data processing and manipulation ecosystem is being revamped to take advantage of the new missingness support (see the short sketch after this list).
  • Cassette.jl provides a powerful mechanism to inject code-transformation passes into Julia’s compiler, enabling post-hoc analysis and extension of existing code. Beyond instrumentation for programmers like profiling and debugging, this can even implement automatic differentiation for machine learning tasks.
  • Heterogeneous architecture support has been greatly improved and is further decoupled from the internals of the Julia compiler. Intel KNLs just work in Julia. Nvidia GPUs are programmed using the CUDANative.jl package, and a port to Google TPUs is in the works.
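
As a minimal sketch of what that missingness support looks like in practice (toy data, purely illustrative):

```julia
# `missing` is the canonical marker for absent data in 1.0.
readings = [1.0, missing, 3.0]

println(sum(readings))               # missing: propagates automatically
println(sum(skipmissing(readings)))  # 4.0: sum over observed values
println(coalesce.(readings, 0.0))    # [1.0, 0.0, 3.0]: fill a default
```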

Download the new release here and see what the fresh take on technical computing is all about!

Author
Sarah Schlothauer

Sarah Schlothauer is an assistant editor for JAXenter.com. She received her Bachelor's degree from Monmouth University in Long Branch, New Jersey and is currently enrolled at Goethe University in Frankfurt, Germany, where she is working on her master's. She lives in Frankfurt with her husband and cat.