Flux is a machine-learning library for Julia, the fast, multi-paradigm, MIT-developed statistical programming language. Flux’s core feature is taking gradients of Julia code: give it a Julia function and a set of arguments, and it returns a gradient. Flux is an exciting package because it can do much of what Google’s Tensorflow does, but as a community-developed project. Additionally, Flux has the advantage of being written in Julia. While this also means that Flux suffers from the same disadvantages that other Julia packages do, the trade-off certainly isn’t terrible.
Tensorflow is a classic tool in the data-science toolbox. Even people who don’t program at all know what Tensorflow is, and for good reason:
Tensorflow is awesome.
Tensorflow was developed by the “Google Brain Team” and released under the Apache license in November of 2015. Tensorflow powers the gradient models behind many modern machine-learning algorithms, including those used by Google, Nvidia, Qualcomm, Lenovo, and hundreds more. Tensorflow has become a staple not only in Python, but in machine learning as a whole.
Flux is a great machine-learning framework because it brings with it a lot of interesting ideas and some really cool and easy syntax. There is a lot that is distinctive about Flux, but generally, the benefits boil down to a few key factors.
As you might already know, Flux is written for Julia, which gives it a massive advantage over packages written in Python. Julia is a far faster language and, in my opinion, has better syntax than Python (though that is purely a personal preference).
This does, however, come with a significant trade-off. Julia is still a relatively new language and does not have anywhere near the user base that Python has. As a result, Julia loses a lot of the support that an enormous language like Python carries with it. So while Flux might be the better solution in some cases, it isn’t necessarily better in every case, because it can be really hard to find answers to any hiccups you run into. It can even be difficult to find documentation for learning the language and its packages in the first place, though this is steadily improving.
I stated before that I like Julia’s syntax far more than I do Python’s. While this is purely subjective, what is not subjective is just how flexible the Julia language can be, and with that flexibility comes an entirely different and really cool way to create gradient models inside of Julia:
These are really cool, and if you want to look into Julia, you should definitely take a good look at them. They are primarily used in two particular scenarios:
f(w) = [v + 5 for v in w]
f(w) = w + 5 - 6
However, syntactic expressions in Julia can be used in hundreds of other ways, all of which are interesting and make programming both easier and more intuitive.
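As a concrete illustration (a small sketch of my own, not taken from the Flux documentation), here is how these one-line, assignment-form definitions behave when you actually call them:

```julia
# Assignment-form function definitions in Julia.
# A comprehension maps the expression over every element of a collection:
f(w) = [v + 5 for v in w]

# A plain arithmetic expression works the same way:
g(w) = w + 5 - 6

f([1, 2, 3])  # adds 5 to each element: [6, 7, 8]
g(10)         # evaluates to 9
```

Both definitions are full, first-class functions; there is no special "lambda" keyword needed, which is part of what makes the syntax feel so lightweight.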
But how does this fit into Flux?
Flux absolutely takes advantage of these versatile expressions and uses them as the base for the entire library. Just take a look at this example from the Flux documentation:
julia> using Flux

julia> f(x) = 3x^2 + 2x + 1;

julia> df(x) = gradient(f, x)[1]; # df/dx = 6x + 2

julia> df(2)
14.0

julia> d2f(x) = gradient(df, x)[1]; # d²f/dx² = 6

julia> d2f(2)
6.0
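The same `gradient` function also handles functions of several arguments, returning one partial derivative per argument. This is my own small sketch, not an example from the docs:

```julia
using Flux  # Flux re-exports `gradient` from Zygote

# A function of two arguments:
h(x, y) = 3x + 2y^2

# `gradient` returns a tuple of partial derivatives,
# one for each argument: ∂h/∂x = 3, ∂h/∂y = 4y
gs = gradient(h, 1, 2)
```

Evaluated at (1, 2), the partials come out to 3 and 8, which is exactly what the hand-computed derivatives predict.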
Tensorflow has its own language advantage in an entirely different way: Python is a very universal language, in that most other high-level languages are able to interface with it, and it is built on C. The shot to the Achilles’ heel of Flux and Julia, however, is definitely Tensorflow and Python’s popularity.
There is a benefit to using anything popular: just as demand grows with supply, usability grows with users. More people using a product creates more conversation about the product, and that conversation is absolutely instrumental for a tool like Tensorflow or Flux to be well documented.
A big issue I presume data scientists will have when trying to use Flux is that a lot of things are done very differently than they are done anywhere else. Flux uses a very language-specific syntax that doesn’t conform to how high-level ML code is typically written. On the other side of the fence, however, is Tensorflow, which takes the exact opposite approach and tends to conform to what is already established in machine learning.
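To make that concrete, here is a minimal sketch of what an idiomatic Flux model looks like (the layer sizes and names here are my own choices for illustration, not from the source). The `in => out` pair syntax and the bare function composition are exactly the kind of language-specific idioms discussed above:

```julia
using Flux

# A small feed-forward network built from Flux's Chain and Dense layers.
model = Chain(
    Dense(4 => 3, relu),  # hidden layer: 4 inputs, 3 outputs, ReLU activation
    Dense(3 => 2),        # output layer: 3 inputs, 2 outputs
    softmax,              # normalize the outputs into probabilities
)

# Calling the model is just calling a function on a vector:
y = model(rand(Float32, 4))
```

Compared with the class-and-session boilerplate older Tensorflow code required, the model here is just a composition of plain Julia callables, which is elegant but unfamiliar if you come from the Python ecosystem.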
Without a doubt, Tensorflow has the notoriety. This makes sense, because Tensorflow has not only been in the industry longer but is also backed by the data monster Google. At the end of the day, it is far easier for a business with millions of developers using a tool to maintain and improve the software than it is for a few GitHub maintainers, who may well have day jobs, to do the same.
I think Flux is really cool, and I enjoy working in it a lot. Tensorflow has a special place in my heart, but it would definitely be interesting to see more people using Flux for their models. Flux brings some really cool ideas to the table, and while that makes it wonderfully unique, it also makes it terribly unfamiliar to use. Will Flux replace Tensorflow? Most likely it won’t, and can’t, but could Flux replace Tensorflow for you?
Julia is certainly still a baby, and lots of its packages are teetering on the edge of being complete enough to be depended on, but regardless of its young age, Julia stands out as a great language for data science now and in the future. Who knows? In ten to fifteen years we could very well see a surge in Julia usage, and with it, Flux usage.