Notes On Julia – The Language

In comments about my first successful plot made with Python, I stated that were I going to change horses, it would be to Julia.

FWIW, while shutting down for the night, I did discover Julia has an interface to Python matplotlib. Were I going to swap paths now, it would be to Julia. Designed for parallel processing and about as fast as C or FORTRAN (all 3 about 10x faster than Python). Then I still could learn the same plotting library…

Late last night I was just reading about concurrent languages while waiting to fall asleep. As I’m interested in cluster computing, all things concurrent and parallel are interesting. Then, for Climate Models, that’s an interest too. For this reason I’d started looking at what language choices there were.

Turns out there are a lot…

This article lists concurrent and parallel programming languages, categorizing them by a defining paradigm. A concurrent programming language is defined as one which uses the concept of simultaneously executing processes or threads of execution as a means of structuring a program. A parallel language is able to express programs that are executable on more than one processor. Both types are listed, as concurrency is a useful tool in expressing parallelism, but it is not necessary. In both cases, the features must be part of the language syntax and not an extension such as a library.

The following categories aim to capture the main, defining feature of the languages contained, but they are not necessarily orthogonal.


1 Actor model
2 Coordination languages
3 Dataflow programming
4 Distributed computing
5 Event-driven and hardware description
6 Functional programming
7 Logic programming
8 Monitor-based
9 Multi-threaded
10 Object-oriented programming
11 Partitioned global address space (PGAS)
12 Message passing
12.1 CSP based
13 APIs/frameworks
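A hedged Python sketch of the article's concurrent-vs-parallel distinction (Python being the language I'm actually working in right now): the program below is *structured* as simultaneous workers either way; whether they truly run on more than one processor is up to the runtime.

```python
from concurrent.futures import ThreadPoolExecutor

# Concurrency: the program is *structured* as simultaneously executing
# workers. Parallelism: they actually run on more than one processor.
# CPython's GIL keeps these CPU-bound threads from running in parallel,
# but the concurrent structure is identical; swapping in a
# ProcessPoolExecutor would add real parallelism.
def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

chunks = [(0, 500_000), (500_000, 1_000_000)]
with ThreadPoolExecutor(max_workers=2) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # equals sum(range(1_000_000))
```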

Then follows a list of about 98 languages and 8 frameworks (like OpenCL and OpenMP). This is, IMHO, one of the plagues of modern computer science. At every turn somebody just feels compelled to “make their own language” and another one sprouts. It is impossible to even keep up with the names of them all, and just finding a good one takes too much work (and only then do you get to start the learning curve).

FWIW, I’d seen some YouTube videos on Julia, and I’m attracted to it. Not enough to actually write anything in it yet ;-) But at least it didn’t do anything obviously stupid at first blush. From the list of 98, the ones I’m vaguely interested in would be:

Distributed computing
Partitioned global address space (PGAS)
Coarray Fortran
High Performance Fortran
Unified Parallel C
CSP based
Main article: Communicating sequential processes
Fortran M (perhaps…)

These application programming interfaces support parallelism in host languages.
OpenMP for C, C++, and Fortran (shared memory and attached GPUs)

But that is STILL too long a list to adequately examine and evaluate given all the rest I’m doing. So I’ll likely limit my FORTRAN interest to Coarray Fortran (as that is part of the language spec. now) and then some of the OpenXX stuff. Then, as time permits, a bit more looking over of the “new” “hot” “trendy” languages of Go and Julia. In particular, Julia interests me as it seems well designed for speed. That takes care and attention, which usually shows up in the rest of a language too.

I’ve bolded the bits that got my attention.

Julia is a high-level general-purpose dynamic programming language that was originally designed to address the needs of high-performance numerical analysis and computational science, without the typical need of separate compilation to be fast, also usable for client and server web use, low-level systems programming or as a specification language.

Distinctive aspects of Julia’s design include a type system with parametric polymorphism and types in a fully dynamic programming language and multiple dispatch as its core programming paradigm. It allows concurrent, parallel and distributed computing, and direct calling of C and Fortran libraries without glue code.

Julia is garbage-collected, uses eager evaluation and includes efficient libraries for floating-point calculations, linear algebra, random number generation, and regular expression matching. Many libraries are available, and some of them (e.g. for fast Fourier transforms) were previously bundled with Julia.
Notable uses

Julia has attracted some high-profile clients, from investment manager BlackRock, which uses it for time-series analytics, to the British insurer Aviva, which uses it for risk calculations. In 2015, the Federal Reserve Bank of New York used Julia to make models of the US economy, noting that the language made model estimation “about 10 times faster” than its previous MATLAB implementation. Julia’s co-founders established Julia Computing in 2015 to provide paid support, training, and consulting services to clients, though Julia itself remains free to use. At the 2017 JuliaCon conference, Jeffrey Regier, Keno Fischer and others announced that the Celeste project used Julia to achieve “peak performance of 1.54 petaFLOPS using 1.3 million threads” on 9300 Knights Landing (KNL) nodes of the Cori (Cray XC40) supercomputer (the 5th fastest in the world at the time; by November 2017 it was 8th fastest). Julia thus joins C, C++, and Fortran as high-level languages in which petaFLOPS computations have been achieved.

I really like it when a language is able to keep up with the leaders in speed… As I’m running on dinky hardware, all the speed I can get matters. I won’t go into the whole language feature set and details, but a couple of important bits:

Language features

According to the official website, the main features of the language are:

Multiple dispatch: providing ability to define function behavior across many combinations of argument types
Dynamic type system: types for documentation, optimization, and dispatch
Good performance, approaching that of statically-typed languages like C
A built-in package manager
Lisp-like macros and other metaprogramming facilities
Call Python functions: use the PyCall package
Call C functions directly: no wrappers or special APIs
Powerful shell-like abilities to manage other processes
Designed for parallel and distributed computing
Coroutines: lightweight green threading

User-defined types are as fast and compact as built-ins
Automatic generation of efficient, specialized code for different argument types
Elegant and extensible conversions and promotions for numeric and other types
Efficient support for Unicode, including but not limited to UTF-8

So built for speed and multiprocessing, yet can use libraries that already exist as needed / desired. Nice. It plays well with others…
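For a taste of that multiple-dispatch idea in something I can actually run here: Julia dispatches on the types of *all* arguments, while Python’s stdlib `functools.singledispatch` only looks at the first, but the weaker analogue still gives the flavor. The `describe` function below is my own made-up example, not from any library.

```python
from functools import singledispatch

# Julia picks a method from the types of all arguments; Python's
# singledispatch looks only at the first argument's type, dispatching
# to whichever registered implementation matches.
@singledispatch
def describe(x):
    return "something"

@describe.register
def _(x: int):
    return "an integer"

@describe.register
def _(x: list):
    return "a list"

print(describe(3))        # an integer
print(describe([1, 2]))   # a list
print(describe("hello"))  # something
```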

Use with other languages

Julia’s ccall keyword is used to call C-exported or Fortran shared library functions individually.
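Python’s closest stdlib analogue to `ccall` is `ctypes`, which likewise reaches into a shared library and calls an exported function with no wrapper layer. A minimal sketch, assuming a Unix-like system where a C library can be located:

```python
import ctypes
import ctypes.util

# Load libc and call its exported strlen directly -- no wrapper code,
# which is the same spirit as Julia's ccall. Declaring argtypes and
# restype tells ctypes the C signature.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```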

Julia has Unicode 11.0 support, with UTF-8 used for strings (by default) and for Julia source code, meaning also allowing as an option common math symbols for many operators, such as ∈ for the in operator.

Julia has packages supporting markup languages such as HTML (and also for HTTP), XML, JSON and BSON, and for databases and web use in general.


Julia’s core is implemented in Julia, C (and the LLVM dependency is in C++), assembly and its parser in Scheme (“FemtoLisp”). The LLVM compiler infrastructure project is used as the back end for generation of 64-bit or 32-bit optimized machine code depending on the platform Julia runs on. With some exceptions (e.g., PCRE), the standard library is implemented in Julia itself. The most notable aspect of Julia’s implementation is its speed, which is often within a factor of two relative to fully optimized C code (and thus often an order of magnitude faster than Python or R). Development of Julia began in 2009 and an open-source version was publicized in February 2012.

Works in many environments. The wiki claims that for the first v6 R.Pi types you need to compile it from sources, but it’s built in on some later v7 & v8 boards.

Current and future platforms

While Julia uses JIT[56] (MCJIT[57] from LLVM), it generates native machine code directly, before a function is first run (not bytecodes that are run on a virtual machine (VM) or translated as the bytecode is running, as with, e.g., Java: the JVM, or Dalvik in Android).

Julia has four support tiers, and currently supports all x86-64 processors, that are 64-bit (and is more optimized for the latest generations) and most IA-32 (“x86”) processors, i.e. in 32-bit mode (all x86 CPUs except for the very old from the pre-Pentium 4-era); and supports more in lower tiers, e.g. tier 2: “fully supports ARMv8 (AArch64) processors, and supports ARMv7 and ARMv6 (AArch32) with some caveats.”[58] Other platforms (other than those mainstream CPUs; or non-mainstream operating systems), have tier 2 or 3 support (or tier 4 if not known to build), or “External” support (meaning in a package), e.g. for GPUs.

At least some platforms may need to be compiled from source code (e.g. the original Raspberry Pi), with options changed, while the download page has otherwise executables (and the source) available. Julia has been “successfully built” on several ARM platforms, up to e.g. “ARMv8 Data Center & Cloud Processors”, such as Cavium ThunderX (first ARM with 48 cores). ARM v7 (32-bit) has tier 2 support and binaries (first to get after x86), while ARM v8 (64-bit) and PowerPC (64-bit) have “Community” support and PTX (64-bit) (meaning Nvidia’s CUDA on GPUs) has “External” support.

Julia is now supported in Raspbian[59] while support is better for newer (e.g.) ARMv7 Pis; the Julia support is promoted by the Raspberry Pi Foundation.[60]

There’s even a bunch of folks using it ;-)

Installing Julia

The easiest way to install Julia is by downloading the 32-bit (ARMv7-a hard float) prebuilt binary from the JuliaLang website.

An older version of Julia (0.6) is also available via apt in Raspbian; we hope to update this to the latest version in the near future.

Compiling Julia

Instructions can be found over here
IJulia notebook

Jupyter will need to be installed manually, as the automatic Conda installer does not work on the ARM architecture. Generally, running

sudo apt install libzmq3-dev
sudo pip3 install jupyter

at the shell should work. Then it should be sufficient to do

Pkg.add("IJulia")

at the Julia REPL.


The JuliaBerry org provides several Raspberry Pi-specific packages:

PiGPIO.jl: managing external hardware using GPIO pins.
PiCraft.jl: manipulating Minecraft on the Raspberry Pi from Julia
SenseHat.jl: interacting with the Sense HAT.

So it is on my “short list” for making the Pi Cluster “go” well. My experiments with FORTRAN on the Pi have so far been disappointing in that the parallel code runs about the same speed as the inline. I’ve not investigated why. I was going to just swap over to Coarray FORTRAN but it was not yet ready to run on the Pi… So hopefully the Julia folks have cared more and have it running well and efficiently.

At their site, they point out the graphical abilities.

Data Visualization and Plotting

Data visualization has a complicated history. Plotting software makes trade-offs between features and simplicity, speed and beauty, and a static and dynamic interface. Some packages make a display and never change it, while others make updates in real-time.

Plots.jl is a visualization interface and toolset. It provides a common API across various backends, like GR.jl, PyPlot.jl, and PlotlyJS.jl. Users who prefer a more grammar of graphics style API might like the pure Julia Gadfly.jl plotting package. For those who do not wish to leave the comfort of the terminal, there is also UnicodePlots.jl.

So lots of choices… at a time when I already have too many choices ;-) There’s also an SQL interface package. So, in theory, it has everything I want, right? Well… As a new language it is possible it will change fast and I’m not keen on that. Also, the user base and “clue postings” are limited. All the stuff that comes with being an early adopter. As I’m already an early adopter on my OS choice (Devuan) I’d rather some bits of the stack were more staid and proven. So I’m sticking with Python for now as it is the most widely used for what I’m interested in doing.

Still, at some point, I may get enough “play time” to play with Julia and data plotting. It’s a low priority “play time” thing, though. Work time will continue going into Python & matplotlib learning. Not wasted as Julia can also do the matplotlib stuff ;-)

Note this bit:


Julia has been downloaded over 4 million times and the Julia community has registered over 2,400 Julia packages for community use. These include various mathematical libraries, data manipulation tools, and packages for general purpose computing. In addition to these, you can easily use libraries from Python, R, C/Fortran, C++, and Java. If you do not find what you are looking for, ask on Discourse, or even better, contribute!

So while it is a young language, they look to have taken the time to make sure it is compatible with others.

The Julia community seems to also be somewhat integrated with the Python community, so I expect many things to be in common, and what I learn in the Python context to easily apply.


IJulia is a Julia-language backend combined with the Jupyter interactive environment (also used by IPython). This combination allows you to interact with the Julia language using Jupyter/IPython’s powerful graphical notebook, which combines code, formatted text, math, and multimedia in a single document. It also works with JupyterLab, a Jupyter-based integrated development environment for notebooks and code.

(IJulia notebooks can also be re-used in other Julia code via the NBInclude package.)

And there’s a way to get into MySQL databases (though Julia includes DataFrames and a JuliaDB)

In Conclusion

Were I the “trendy new language” sort, I’d likely be all over Julia as my language choice for what I’m doing. Being more from the “What has already been used 100 times so it isn’t buggy and has lots of ‘how to’ pages” sort, I’ve chosen to do my first round on MySQL / Python. I intend to continue on that path as long as it moves well. (So far it has. Despite Debian requiring a half dozen different things be installed – something the Windows and Mac install don’t require, btw – I was able to get my first graph made in about 2 elapsed days, or about 1 real day ;-)

Julia is on my very short list of likely languages to learn and use in the future. Especially given the speed and parallel / concurrent design of it, for doing big computes on small iron, that is a very attractive set of features.

Well, enough about Julia. Time for me to get back to the matplotlib mine and make a more interesting graph from the CSV data, then figure out how to use the MySQL database effectively.



About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in Tech Bits. Bookmark the permalink.

21 Responses to Notes On Julia – The Language

  1. H.R. says:

    Hmmm… I understood most of this. Part of that is due to a good write-up by you, E.M., and part of that is the descriptive bits extracted from github.

    I’ve no particular need to do any programming. I learned my Fortran 77 as a matter of course in an engineering program, but my career took a path such that I never used it again after school. All of the programming I’ve done has been to make machine tools produce what was needed or some minor fussing with PLCs. Every machine used a different program. What fun! (No sarc there. It’s fun.)

    But if I found a need to dive back into more generalized programming for data management and data manipulation, I can see why you’re enthusiastic about Julia. It seems to be designed to take this little thing or that little thing from languages everyone already uses so you don’t have to learn a new language to get that one little functionality you need.

    Please make a mental note, E.M., to write up your actual experience with Julia versus this initial impression when you do get a chance to play with it. I have an idle curiosity about whether it’s as good as its press makes it out to be and measures up to expectations.

  2. E.M.Smith says:


    Will do. FWIW, I’ve spent some of lunch watching this “Intro to Julia” video. It’s good and pretty easy.

    Covers a lot of ground. Including graphics / plotting. Looks easier than what I’ve been doing ;-) Then again, all demos do 8-{

    Has the same dynamic typing “assignment is moving a pointer” behavior as Python, but the demo shows their “copy” function that makes an actual copy, so you can choose.
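    That aliasing-vs-copy behavior is easy to show in Python, which I’m using now; a quick sketch:

```python
# Assignment binds another name to the same object (as the video
# shows for Julia); copy() makes an independent shallow copy.
a = [1, 2, 3]
b = a            # alias: b and a are the same list
b[0] = 99
print(a)         # [99, 2, 3] -- mutation through b is visible via a

c = a.copy()     # independent shallow copy
c[0] = 0
print(a)         # [99, 2, 3] -- unchanged
print(c)         # [0, 2, 3]
```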

    The matrix / array math is interesting in a “scary easy power” kind of way… They have a \ operator that, applied to certain arrays, gives the least squares result (at least I think that’s what they said…)

    At about the 1 hour mark they do a 10 M sums benchmark in Julia (function), Julia (hand written), C (hand written), Python (hand), Python (built in), Python (numpy) AND plot it all in just a couple of minutes… While Python Numpy was very fast, Python by hand was way slow. Julia was very fast either way. The whole benchmark process is an impressive display of language function too.

    Python Numpy uses SIMD (vector instructions) when available, so you would expect it to cook. That the Julia hand written almost matches the best built in is very impressive.

    You also get to see the same code in each language…

    Hand copied results:

    Julia built in         6.8
    Python numpy           7.0
    C                     11.8
    Julia hand written    12.6
    Python built in      155.3
    Python hand written  480.3

    So yeah, Python Numpy is a really good idea… OTOH, Julia just rocks however you do it.
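    A stdlib-only miniature of the hand-written vs built-in comparison from the video (absolute timings will vary by machine, so only the answers are checked; numpy would be the third contender):

```python
import timeit

# Hand-written accumulation loop vs the built-in sum over the same
# data. The built-in runs the loop in C, so it is typically much
# faster than the Python-level loop; the answers must match exactly.
data = list(range(100_000))

def hand_sum(xs):
    total = 0
    for x in xs:
        total += x
    return total

t_hand = timeit.timeit(lambda: hand_sum(data), number=20)
t_builtin = timeit.timeit(lambda: sum(data), number=20)

print(hand_sum(data) == sum(data))  # True
print(f"hand loop {t_hand:.3f}s vs builtin sum {t_builtin:.3f}s")
```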

    After the break (about 1:20) they go into factorization and matrix math that I last used about 1/2 a century ago and I kind of glazed… so might want to skip that part if you don’t need it ;-)

    At 1:26 they start doing Eigendecompositions and Hessenberg and Schur factorizations… at this point I’m just watching because the woman doing the presenting is cute ;-) Something about the way she says “Eigen Factorization” ;-)

  3. E.M.Smith says:

    Oh, and so you won’t have to wait for my evaluation, here’s a 1/2 hour review of Julia by a professional Numerical Analysis Programmer from 2 years ago. Presented to a Python audience (so pitching it to a group who both might be most interested in it, and who would most want to compare it to Python).

    In general, I’m much more inclined to Julia than Python, but for the fact that I KNOW Python will work right on the R. Pi (as important things depend on it) and I KNOW there’s a lot of help / how-to / hints sites for it. Plus I have used it before so it isn’t totally alien ;-)

    But “we’ll see”. I’m giving myself about a week to get to the point of a few interesting graphs. If I’m hung up by then, then I’ll be looking to sample some other path. Since I’ve already made one (crappy, I admit) graph, I’m pretty sure there will be no obstacles in the Python / Pandas / MySQL path.

  4. H.R. says:

    Ah… there are two programming worlds. I was wondering why my interest in all this stuff was piqued. When I was in school, CAD was real, but not ready for prime time. I had a graphics class which was essentially programming a CAD solution before CAD existed. We were tasked with inputting the formulas that would graphically output the result of…. whatever an engineer could dream up. One exercise was to make a piston go up and down and to create a variable input field for the cam… with the math for the cam.

    It was a matter of mere months before graphics packages appeared that put paid to all that. We were tasked, as homework, to program AutoCAD outputs before there was an AutoCAD. While I was still in school, crude versions of AutoCAD-like packages were popping up everywhere, and history records that AutoCAD was the winner at that time (’80s and ’90s).

    But you (E.M.) and others here inhabit a programming world, which is totally different from my world. As I now have spare time to cogitate on the posts and comments, I find it is stirring memories of the skilz needed to make something happen via bits and bytes and dribs and drabs in a tedious, logical sequence.

    Those machine-specific programs I referred to earlier were a much higher language than what I had learned, and at the moment, I’m not sure which I prefer. My memories are viewed through rose colored glasses. Current reality probably is a PITA but actually better than “the old days.”

    And the vocabulary………… Oh my!

    But I don’t have to do nuttin’ about nuttin’ except to read and have fun, so y’all carry on while I entertain myself on your posts and comments. Consider that a “thanks” to E.M. and all who comment on the hardware/software/programming details. As Tiny Tim said, “God bless us, everyone.” (I can’t recall that he added “And deliver us from fat fingers.” but I’m sure it’s in there somewhere. 😜)

  5. H.R. says:

    @ E.M. who wrote: “In general, I’m much more inclined to Julia than Python, but for the fact that I KNOW Python will work right on the R. Pi (as important things depend on it) and I KNOW there’s a lot of help / how-to / hints sites for it. Plus I have used it before so it isn’t totally alien ;-) “

    See? That’s sorta what I was getting at in the comment I was writing as you were posting. I don’t really have a clue, ancient memories or otherwise, about board level software and hardware relationships. I suppose I could get my mind right to program again, but what boards require to get them to do that voodoo they do…? I’d be starting at Jump Street.

    Give me a board that’s supposed to do something or other and I’ll give you a board that makes an excellent door stop, none better anywhere.

  6. Sandy MCCLINTOCK says:

    I expect you know this outfit
    … if not they are worth a look. I believe that the ggplot2 library was developed by one of the team (Hadley Wickham)

  7. Larry Ledwick says:

    I am trying to install Julia on my Windows desktop; the install completed, and basic commands work fine, but Gadfly does not function and pukes out a couple pages of errors.

    Will putter with it later but it would be really nice if some of these new languages would actually install and basic essential useful functions would actually work and when you got an error you could find a solution to the problem without having to plow through dozens of broken “solutions” that don’t work, or the solutions tell you to do Foo but give you no clue how to do Foo or where to go to get info on doing Foo.
    (pet peeve rant #157)

  8. Larry Ledwick says:

    Okay I beat it into submission after scanning a bunch of web pages.

    Examples in the books I have (which I just bought) are using a deprecated syntax for some of the commands. Have to explicitly tell it what you are using before you use it

    such as:

    julia> using Pkg
    julia> Pkg.add("Gadfly")
    julia> using Gadfly

    On Windows 7 you also have to upgrade to PowerShell 5.1;
    they don’t support the native shell in Win 7.

    Once I got past that I had to install about 4 things. Each time I ran “using Gadfly” it threw a page and a half of errors, but the only one that mattered was near the top where it said:

    Foo is broke run this command to fix it:

    ERROR: LoadError: Rmath not properly installed. Please run Pkg.build("Rmath") and restart julia

    After doing that for Pkg.build("Arpack") etc.

    julia> Pkg.installed()
    Dict{String,Union{Nothing, VersionNumber}} with 3 entries:
    "DataFrames" => v"0.17.0"
    "Arpack" => v"0.3.0"
    "Gadfly" => v"1.0.1"

    Then I got a simple graph to display, it sends the output to the default browser
    julia> using Gadfly
    [ Info: Precompiling Gadfly [c91e804a-d5a3-

    julia> plot(x = rand(10), y = rand(10))

    Now I know the basic install is not broken and can continue beating it with a stick until it does what I want.

  9. cdquarles says:

    Hmm, Julia does sound interesting.

  10. E.M.Smith says:



    Let us know what you think of the language, once you get to that point ;-)

    Especially if I ought to sink time into using it….

  11. Larry Ledwick says:

    Okay here are some reference links folks might find helpful

    Julia tips and notes
    Julia manual <-- note ref to pyplot lower right corner

    To exit the interactive session, type ^D

    If you have code that you want executed whenever Julia is run, you can put it in ~/.juliarc.jl:

    julia --color=no <-- I need this to see some of their color choices

    $ julia --help

    julia [switches] -- [programfile] [args…]
    -v, --version Display version information
    -h, --help Print this message

    -J, --sysimage Start up with the given system image file
    -H, --home Set location of `julia` executable
    --startup-file={yes|no} Load `~/.julia/config/startup.jl`
    --handle-signals={yes|no} Enable or disable Julia's default signal handlers
    Use native code from system image if available
    Enable or disable incremental precompilation of modules

    -e, --eval Evaluate
    -E, --print Evaluate and display the result
    -L, --load Load immediately on all processors

    -p, --procs {N|auto} Integer value N launches N additional local worker processes
    "auto" launches as many workers as the number
    of local CPU threads (logical cores)
    --machine-file Run processes on hosts listed in

    -i Interactive mode; REPL runs and isinteractive() is true
    -q, --quiet Quiet startup: no banner, suppress REPL warnings


    julia [switches] -- [programfile] [args…]
    -v, --version Display version information
    -h, --help Print this message

    -J, --sysimage Start up with the given system image file
    --precompiled={yes|no} Use precompiled code from system image if available
    --compilecache={yes|no} Enable/disable incremental precompilation of modules
    -H, --home Set location of `julia` executable
    --startup-file={yes|no} Load ~/.juliarc.jl
    --handle-signals={yes|no} Enable or disable Julia's default signal handlers

    -e, --eval Evaluate
    -E, --print Evaluate and show
    -L, --load Load immediately on all processors

    -p, --procs {N|auto} Integer value N launches N additional local worker processes
    "auto" launches as many workers as the number of local cores
    --machinefile Run processes on hosts listed in

    -i Interactive mode; REPL runs and isinteractive() is true
    -q, --quiet Quiet startup (no banner)
    --color={yes|no} Enable or disable color text
    --history-file={yes|no} Load or save history

    --compile={yes|no|all|min} Enable or disable JIT compiler, or request exhaustive compilation
    -C, --cpu-target Limit usage of cpu features up to
    -O, --optimize={0,1,2,3} Set the optimization level (default is 2 if unspecified or 3 if specified as -O)
    -g, -g Enable / Set the level of debug info generation (default is 1 if unspecified or 2 if specified as -g)
    --inline={yes|no} Control whether inlining is permitted (overrides functions declared as @inline)
    --check-bounds={yes|no} Emit bounds checks always or never (ignoring declarations)
    --math-mode={ieee,fast} Disallow or enable unsafe floating point optimizations (overrides @fastmath declaration)

    --depwarn={yes|no|error} Enable or disable syntax and method deprecation warnings ("error" turns warnings into errors)

    --output-o name Generate an object file (including system image data)
    --output-ji name Generate a system image data file (.ji)
    --output-bc name Generate LLVM bitcode (.bc)
    --output-incremental=no Generate an incremental output file (rather than complete)

    --code-coverage={none|user|all}, --code-coverage
    Count executions of source lines (omitting setting is equivalent to "user")
    --track-allocation={none|user|all}, --track-allocation
    Count bytes allocated by each source line

  12. Larry Ledwick says:

    Interesting read on why Julia was developed and how it assists going from an idea to working fast code very quickly.

  13. E.M.Smith says:


    That “why betting on” article is basically why I’m interested in Julia. Being able to do both C and FORTRAN like things with their speed, but in a more “user friendly” language that didn’t set its core feature set 1/2 Century ago…

    While both C and FORTRAN have done marvelous jobs of keeping up with tech trends, a good bit of it is sort of glued on. Often a well done glue job, but… Especially around parallel processing.

    Coarrays are nice, but very array focused. Great for heavy math things, not so good for spitting out 2000 text-testing threads to crack passwords, or checking 2000 records at a time from a database system for particular content (with individual record locking so they don’t block each other).
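    The kind of task-farm pattern I mean, sketched with Python’s stdlib (the record data here is made up; in Julia, Threads.@threads or spawned tasks would play the same role):

```python
from concurrent.futures import ThreadPoolExecutor

# Check a batch of records concurrently for a particular content.
# Each record is tested independently, so the work farms out cleanly.
records = [f"record-{i}" for i in range(2000)]

def matches(record):
    return record.endswith("-1234")

with ThreadPoolExecutor(max_workers=8) as pool:
    flags = list(pool.map(matches, records))

hits = [r for r, hit in zip(records, flags) if hit]
print(hits)  # ['record-1234']
```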

    Then C for threads is a great thing… as long as you like being kneecaps to toes with a world of assembly and hardware abstractions that are not that abstract. Handling all the locks yourself is a pain too…

    So you end up using an OpenXXx that may or may not have been grafted on well and may or may not be efficient, and especially in the world of Arm Linux: May or may not be v6, v7 32 bit, or v8 64 bit and may or may not use SIMD / GPU computing – all depending on how lazy the person doing the port of Linux was feeling at the time… Orders of magnitude of performance difference between one end and the other… v6 32 bit soft float vs v8 64 bit hard float with SIMD. All on the same 64 bit hardware…

    So I’m hoping that Julia is more speed sensitive and that it is better ported to use the available hardware (if careful about the basic OS used…). One of the “issues” that has me not trying it yet is that the world of Arm is still struggling to get 64 bit and SIMD as default in many distributions and I have no idea how many (if any) of the Julia Libraries are in armhf or arm64 versions. LOTS of folks on first port / creation of a library only do it in AMD64 because “that’s all everyone does”… (No, Virginia, the world does not all run on Wintel boxes…)

    So “whenever” my first use comes, it will be specifically to do some simple A / B speed test comparisons of C vs FORTRAN vs Julia on arm64 and armhf builds (Odroid XU4 and Pi M3 / RockPro64) Then see if “the usual” libraries are in existence or not.

    At present, the stuff for Pi says to download from some site and {porting / installing work}… Not yet “apt-get install” per the online advice pages. So, OK, in Debian for AMD64 not yet for arm64. Got it. I’ll be back later… 8-(

    That entered into my decision to do Python for now. It’s here, now, and works. Done.

    IF I spend 6 months using it to make some graphs I want, then move to Julia and can still use the same library code and “magic graph sauce” I’m fine with that. (Then again, if suddenly I can do “apt-get install julia” and it just goes… I’ll start playing / using a lot sooner ;-)

    I’m not interested in the Windows (anything) nor in the Mac (as my machine has a dead PSU now so not running), nor much interest in the Intel / AMD64 / Linux versions (as I hate fans and have gotten very very comfortable in my no fan world now… and my AMD64 box has a loud fan…). IF it’s my only choice, well, I’ll fire it up in a couple of months for a test drive of the language.

    But really, I think it will make it to the active Debian arm64 build pretty quickly. There’s some large makers of massively parallel boxes using arm chips now… We’re talking Cray and others that sell to big TLAs and Universities; so I expect it will be ported fast as that is the core user base for things like Julia. I’m guessing some time in the next 6 months, but who knows, maybe it’s done already…

    Package: julia (1.0.3+dfsg-4)

    high-performance programming language for technical computing

    Julia is a high-level, high-performance dynamic programming language for technical computing, with syntax that is familiar to users of other technical computing environments. It provides a sophisticated compiler, distributed parallel execution, numerical accuracy, and an extensive mathematical function library. The library, mostly written in Julia itself, also integrates mature, best-of-breed C and Fortran libraries for linear algebra, random number generation, FFTs, and string processing. Julia programs are organized around defining functions, and overloading them for different combinations of argument types (which can also be user-defined).

    This package provides a complete Julia installation (JIT compiler, standard library, text-based user interface).


    root@odroidxu4:/SG2/ext/chiefio/SQL/bin# apt-get install julia
    Reading package lists... Done
    Building dependency tree       
    Reading state information... Done
    Package julia is not available, but is referred to by another package.
    This may mean that the package is missing, has been obsoleted, or
    is only available from another source
    E: Package 'julia' has no installation candidate

    So not yet in the Arm World… Present in Sid for the rest of the AMD64 world though…

  14. Larry Ledwick says:

    That is what I am hoping: as the new “hot buzz” language it is more likely to get ported to new hardware than some others, since you have a lot of folks doing that sort of work. As you say, both commercial and TLA groups are moving aggressively into big data applications, and that means you need speed. Especially when you are dealing with tables with more than a billion rows in them, you end up with very big selects and lots of data to handle to get to the final output.

  15. jim2 says:

    Fan silencers …

    quiet fan with hydraulic bearings …

    tactics for quieting a badass PC …

    I’m wondering if a system of exhaust pipes with the fan at the outlet (and the outlet away from where I work) would work also. My computer fans are a bit noisy too.

  16. jim2 says:

    The databases used by “big data” like facebook, for example, ain’t yo daddy’s database.

  17. H.R. says:

    @jim2 – My son is a serious gamer. He stays on the cutting edge of speed, particularly as it applies to graphics. His game monitor is 60″+/- and he has the latest stuff seriously overclocked. He’s into watercooling. The hell with lousy fans. 😜😆
    I am waiting for one of those heat exchanger/cooling ponds they have at malls to appear on his condo grounds.
    I’m not really kidding all that much, except maybe the cooling pond, although…..

  18. Larry Ledwick says:

    Well I gave up finding a list of commands supported by Julia, apparently no one thinks it is useful, so I did the next best thing and made my own.

    Using the help function I first did a help on every letter of the alphabet

    help?> a
    When you do that Julia gives you a list of terms it thinks you might be interested in, separated by spaces.

    Like so
    help?> a
    search: any all abs Any axes atan asin asec any! all! acsc acot acos abs2 atanh atand asinh asind asech asecd ascii angle acsch acscd acoth acotd acosh acosd Array ARGS atexit argmin argmax append! adjoint

    Couldn’t find a
    Perhaps you meant !, %, &, *, +, -, /, :, , \, ^, |, ~, ÷, π, ℯ, ∈, ∉, ∋, ∌, ∘, √, ∛, ∩, ∪, ≈, ≉, ≠, ≡, ≢, ≤, ≥, ⊆, ⊇, ⊈, ⊉, ⊊, ⊋, ⊻, abs, all, any, NaN, Val, cat, map, max, tan, fma, isa, !=, // or <:
    No documentation found.

    Binding a does not exist.


    Then, after a little application of sed to strip out spaces and remove empty lines, and running that output through sort and uniq, you end up with a file of 812 words that Julia returns with the help function.
    (presumably these are all the commands it recognizes, but I cannot say that with absolute certainty)

    The final sorted and stripped unique list is 812 terms –

    That is sort of a long list to post here but I have it if anyone wants it, might be manageable if put into a document file as several columns of words you could scan through to look for familiar commands.
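    [For what it’s worth, Julia can produce a similar list itself: `names(Base)` returns every symbol the Base module exports, which is close to “the commands it recognizes”. A minimal sketch, assuming Julia 1.x (the exact count differs between Julia versions, so don’t expect exactly 812):

    ```julia
    # Every binding exported by Base, sorted one per line --
    # roughly the same list, with no sed/sort/uniq round trip needed.
    exported = sort(String.(names(Base)))
    println(length(exported))      # count varies by Julia version
    foreach(println, exported)
    ```

    Redirect the REPL output to a file and you have the list directly.]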

  19. E.M.Smith says:


    I’ll be interested in it eventually… For now you could just post your sed / uniq script…

  20. Larry Ledwick says:

    I did it as single commands writing to files to make sure I got what I wanted.
    I get interrupted all the time due to the nature of my job, so I tend to do this sort of project by breaking things down into discrete steps and writing each step’s output to a slightly differently named file, so if necessary I can drop back to a previous step if I muck something up.

    Strip spaces out of file and replace with new line

    sed 's/ /\n /g' file_orig > file_out_1

    Remove leading spaces in lines

    sed 's/^ *//' file_out_1 > file_out_2

    Remove blank lines from a file
    sed '/^$/d' file_out_2 > file_out_3

    mv file_out_3 julia_command_list_stripped

    Then I sorted the file
    sort ./julia_command_list_stripped >./julia_command_list_sorted

    cat ./julia_command_list_sorted | uniq > ./julia_command_list_sorted_uniq
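    [For what it’s worth, the steps above collapse into one pipeline — a sketch using `tr` for the space-to-newline split and `sort -u` in place of `sort | uniq`; `help_dump.txt` is a stand-in name for the captured help output:

    ```shell
    # Stand-in sample mimicking two lines of Julia's help?> search output;
    # point this at the real captured help file instead.
    printf 'any all abs Any\nabs any atan\n' > help_dump.txt

    # Split on spaces, drop blank lines, sort and de-duplicate in one pass.
    # LC_ALL=C pins the sort order so runs are reproducible.
    tr ' ' '\n' < help_dump.txt | sed '/^$/d' | LC_ALL=C sort -u > term_list.txt

    cat term_list.txt
    ```

    Same result, and nothing lost by keeping the intermediate files if you prefer that for recoverability.]
    
    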

  21. Larry Ledwick says:

    I was talking to the head of IT here at work last night about Julia, and he asked me to look around and see if Julia supports the Advanced Vector Extensions (AVX) that Intel chips have to improve compute performance.

    This sure looks to me like the Julia compiler will write code that uses the Vector extensions
    Looks like Julia has been supporting AVX code since Aug. 20, 2014
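    [One way to check this from the REPL — a sketch, assuming Julia 1.x on an AVX-capable x86-64 chip: dump the native code the JIT generates for a vectorizable loop and look for `ymm` registers and `v`-prefixed instructions like `vmulpd` / `vaddpd`:

    ```julia
    # Dump the machine code Julia generates for a simple dot product.
    # On an AVX-capable Intel/AMD chip the listing should show ymm
    # registers and v-prefixed packed instructions (vmulpd, vaddpd).
    function dot_sum(a, b)
        s = 0.0
        @inbounds @simd for i in eachindex(a, b)
            s += a[i] * b[i]
        end
        return s
    end

    using InteractiveUtils   # provides code_native in Julia 1.x
    code_native(dot_sum, (Vector{Float64}, Vector{Float64}))
    ```

    If the chip lacks AVX, the same dump falls back to the plain SSE `xmm` forms, so the output itself tells you what the compiler decided to use.]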
