# my net house

WAHEGURU….!

## Engineer’s Guide to Julia Programming

December 12, 2017


Finally the moment has come when I can say that I can be productive while my solution remains parallel, optimizable, customizable, and, last but not least, glue-able. Those are the fantastic features I believe one can rely on while learning any new programming language and developing a high-quality AI/ML software solution.

Why?

### Julia Solves the Two-Language Problem

**Important disclaimer for newbies:** I am a Pythonista by choice, and over the last few years I have developed projects using Python and its sister technologies to provide solutions related to:

- Automation (Python scripting)
- Web development (Django, Flask, Sanic, Tornado)
- Data analysis (SageMath, SymPy, ParaView, spreadsheets, Matplotlib, NumPy, SciPy, scikit-learn)
- Quantitative analysis (Quantopian.com)
- 3D modeling (FreeCAD, BIM, IFC)
- Cluster computing (Rocks Cluster)

**Now I just wanted a tool that would allow me to write pure mathematical expressions (using the proper symbols, not variable names) and write machine-learning/artificial-intelligence/deep-learning code while staying at the core layer of abstraction**, unlike TensorFlow, PyTorch, or NumPy/Pandas. I am not against these libraries, which have helped me **"soooo"** much over the years, but I have no idea what is happening under the hood, and I will probably never be able to change the internals of NumPy/Pandas/Cython or anything else in Scientific Python, because underneath there is a large amount of Fortran/C++ (or even Pascal-like) code crunching the numbers.

The stuff an engineer needs to perform for various kinds of jobs in the Julia programming language can be described as follows:

**Solving a Simple Mathematical Equation in Julia:**

```julia
using LinearAlgebra  # for norm (needed in Julia ≥ 0.7)

A = randn(4, 4)
x = rand(4)
b = A * x
x̂ = A \ b        # here we have written the x-hat symbol: solve A*x̂ = b
println(A)
println(x)
println(x̂)
@show norm(A * x̂ - b)
```

**Doing Matrix Operations in Julia:**

```julia
using LinearAlgebra  # for eigmax, det, and I (needed in Julia ≥ 0.7)

A = randn(4, 4) |> w -> w + w'  # pipe A through w -> w + w' to symmetrize it
println(A)
λ = eigmax(A);  # have you checked the lambda?
@show det(A - λ*I)  # ≈ 0, since λ is an eigenvalue of A
```

**Performing Integration:**

Performing integration might be one of the most important day-to-day tasks for someone involved with problems related to modeling and designing a solution using a CAS (Computer Algebra System) like Matlab or SageMath. But designing a solution in a CAS and then finding a way to bring it into production is a "LOt of WoRk", the kind I assume only comes with either experience or lots of extra brain cells. 😉 Here Julia plays an important role: **solving the two-language problem**.

```julia
# Integrating the Lorenz equations
using ODE
using PyPlot

# define the Lorenz equations
function f(t, x)
    σ = 10
    β = 8/3
    ρ = 28
    [σ*(x[2]-x[1]); x[1]*(ρ-x[3]); x[1]*x[2] - β*x[3]]
end

# run f once
f(0, [0; 0; 0])

# integrate
t = 0:0.01:20.0
x₀ = [0.1; 0.0; 0.0]
t, x = ode45(f, x₀, t)
x = hcat(x...)'  # rearrange storage of x
# Side note: what is `...` doing in Julia? (Remember *args and **kwargs in Python?)
# For more see: goo.gl/mTmeR7

# plot
plot3D(x[:,1], x[:,2], x[:,3], "b-")
xlabel("x")
ylabel("y")
zlabel("z")
xlim(-25, 25)
ylim(-25, 25)
zlim(0, 60);
```
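Since the side note above asks what `...` does, here is a minimal self-contained sketch of Julia's splat/slurp operator (the variable and function names below are my own, not from the original post):

```julia
# `...` at a call site "splats": it unpacks a collection into
# individual arguments, much like Python's *args when calling.
cols = [[1, 2], [3, 4], [5, 6]]
M = hcat(cols...)          # same as hcat([1,2], [3,4], [5,6])
println(size(M))           # each vector became a column of a 2×3 matrix

# In a function signature, `...` "slurps": it collects any number
# of extra arguments into a tuple, like Python's *args in a def.
addall(xs...) = sum(xs)
println(addall(1, 2, 3))   # 6
```

This is exactly what `hcat(x...)'` does in the Lorenz example: it unpacks the array of solution vectors into separate columns.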

**Really Interesting Dynamic Type System:**

This is one of the most interesting parts of Julia for me, and its GREAT type system is so much fun. You know why? **Because it knows how long that bone is and how much calcium will be in it:**

### Built-in numeric types

Julia’s built-in numeric types include a wide range of

1. integers: Int16, Int32, Int64 (and unsigned ints), and arbitrary-precision BigInts

2. floating-points: Float16, Float32, Float64, and arbitrary-precision BigFloats

3. rationals using the integer types

4. complex numbers formed from above

5. vectors, matrices, linear algebra on above
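A few expressions to try for yourself, covering the numeric types listed above (a quick sketch; the variable names are mine):

```julia
# arbitrary-precision integers: big() promotes to BigInt
big_n = big(2)^100
println(typeof(big_n))     # BigInt, no overflow

# rationals built on the integer types, with exact arithmetic
r = 3//4
println(r + 1//4)          # exactly 1//1, no floating-point rounding

# complex numbers formed from the above
z = 1 + 2im
println(z * conj(z))       # 5 + 0im

# different floating-point precisions
println(Float32(pi))       # π rounded to single precision
```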

**OK, let's have the fun!**

I encourage you to run the following code in a Jupyter notebook running a Julia kernel.

```julia
π
typeof(π)  # returns Irrational{:π}. Because pi is an irrational number? 😉
```

**Let's hack Julia's type system on a much deeper level! (Because it is much more than classes)**

What else do we need to know about it?

Define a new parametric type in Julia:

```julia
struct vector_3d{T<:Integer}   # `type` in Julia 0.6 became `struct` in 0.7+
    x::T
    y::T
end
# Can we call x and y "data members", like C++ data members? More or less.

type_call = vector_3d(25, 25)  # this is how we construct it; T is inferred as Int
```

Let's just make types more interesting (and immutable):

```julia
struct GF{P,T<:Integer} <: Number   # `immutable` in Julia 0.6 became `struct`
    data::T
    function GF{P,T}(x::Integer) where {P,T<:Integer}
        return new(mod(x, P))       # store x modulo P: a Galois-field element
    end
end
```
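To actually construct a `GF` element, both type parameters are needed. A minimal usage sketch, repeating the definition so it is self-contained and adding a convenience outer constructor of my own so that only `P` has to be written:

```julia
# a Galois-field element: an integer stored modulo P
struct GF{P,T<:Integer} <: Number
    data::T
    function GF{P,T}(x::Integer) where {P,T<:Integer}
        return new(mod(x, P))
    end
end

# convenience constructor (my own addition): infer T from the argument
GF{P}(x::T) where {P,T<:Integer} = GF{P,T}(x)

a = GF{7}(10)
println(a.data)    # 3, since 10 mod 7 == 3
```

Note that `P` lives in the type itself, so `GF{7}` and `GF{5}` are distinct types and the compiler can specialize code for each modulus.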

### Deep Learning and Machine Learning in Julia:

In truth, Julia is designed so that you can write "mathematical functions" using native language syntax alone. If you want to do linear regression, rather than installing a new library and calling its built-in linear function (which may be written in C, C++, or Fortran, or be more or less optimized Cython magic), Julia provides fast built-in methods so you can write your own linear regression, as easy as Python and as fast as C++/Fortran.
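To make that concrete, here is a minimal "write it yourself" linear-regression sketch using nothing but the standard library; the data and names are invented for illustration:

```julia
using LinearAlgebra   # standard library, no external packages

n = 100
x = collect(LinRange(0.0, 10.0, n))
X = [ones(n) x]            # design matrix: intercept column + feature column
β_true = [2.0, 3.0]        # the "unknown" intercept and slope
y = X * β_true             # noiseless observations for this sketch

# Least-squares fit in one line: the backslash operator solves
# min ‖X*β − y‖ via a QR factorization.
β̂ = X \ y
println(β̂)                # recovers ≈ [2.0, 3.0]
```

The entire algorithm is visible in plain Julia; there is no second, hidden implementation language underneath.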

Available machine-learning packages in Julia:

**Scikit-Learn in Julia:**

ScikitLearn.jl implements the popular scikit-learn interface and algorithms in Julia. It supports both models from the Julia ecosystem and those of the scikit-learn library (via PyCall.jl).

https://github.com/cstjean/ScikitLearn.jl

**Text Analysis in Julia:**

The basic unit of text analysis is a document. The TextAnalysis package allows one to work with documents stored in a variety of formats:

- *FileDocument*: a document represented using a plain text file on disk
- *StringDocument*: a document represented using a UTF8 string stored in RAM
- *TokenDocument*: a document represented as a sequence of UTF8 tokens
- *NGramDocument*: a document represented as a bag of n-grams, which are UTF8 n-grams that map to counts

https://github.com/JuliaText/TextAnalysis.jl

**A machine-learning package named MachineLearning:**

The MachineLearning package represents the very beginnings of an attempt to consolidate common machine learning algorithms written in pure Julia and presenting a consistent API. Initially, the package will be targeted towards the machine learning practitioner, working with a dataset that fits in memory on a single machine. Longer term, I hope this will both target much larger datasets and be valuable for state of the art machine learning research as well.

https://github.com/benhamner/MachineLearning.jl

**Deep Learning in Julia:**

Mocha is a deep learning framework for Julia, inspired by the C++ framework Caffe. Efficient implementations of general stochastic gradient solvers and common layers in Mocha can be used to train deep/shallow (convolutional) neural networks, with optional unsupervised pre-training via (stacked) auto-encoders.

https://github.com/pluskid/Mocha.jl

**Deep Learning with Automatic Differentiation (what is automatic differentiation?):**

Knet (pronounced “kay-net”) is the Koç University deep learning framework implemented in Julia by Deniz Yuret and collaborators. It supports GPU operation and automatic differentiation using dynamic computational graphs for models defined in plain Julia. This document is a tutorial introduction to Knet. Check out the full documentation and Examples for more information. If you need help or would like to request a feature, please consider joining the knet-users mailing list. If you find a bug, please open a GitHub issue. If you would like to contribute to Knet development, check out the knet-dev mailing list and Tips for developers.

https://github.com/denizyuret/Knet.jl

More resources on Julia programming:

http://online.kitp.ucsb.edu/online/transturb17/gibson/

Feel free to clap, have fun with Julia, and stay connected.