May 15, 2020
Over the last couple of days, I spent a lot of time reading about and experimenting with different AI/ML frameworks. The big dogs on the block are TensorFlow from Google and PyTorch from Facebook.
Practically, TensorFlow was a pain to install on my HP G7 servers. My PyTorch experience was much smoother.
Both frameworks rely on tensors, which I understand as multidimensional arrays, generalizing matrices to more than two dimensions.
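To make that concrete, here is a minimal sketch of the idea, using NumPy as a stand-in for illustration (TensorFlow and PyTorch tensors follow the same n-dimensional array model, just with GPU support and automatic differentiation on top):

```python
import numpy as np

# A scalar, a vector, a matrix, and a 3-D tensor are all the same kind
# of object: an n-dimensional array, distinguished only by its rank.
scalar = np.array(3.0)                # rank 0: a single number
vector = np.array([1.0, 2.0, 3.0])    # rank 1: a list of numbers
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])       # rank 2: a classic matrix
tensor = np.zeros((2, 3, 4))          # rank 3: a 2 x 3 x 4 block of numbers

print(scalar.ndim, vector.ndim, matrix.ndim, tensor.ndim)  # 0 1 2 3
```

In PyTorch the equivalent would be `torch.tensor` / `torch.zeros` with the same shapes.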
All of this connects to my prior knowledge, going back to my first degree in Computer Science, where I played a lot with neural networks, and I have kept playing with NNs ever since. Back in 2000, one of my diploma thesis supervisors told me he knew the theory behind NNs but really couldn't see a use case for them … he held a PhD in economics.
Back in the present, I conclude that the real shift to mainstream machine learning came with the widespread availability of GPUs and, more recently, dedicated tensor chips. It's really impressive how much of a speedup you can get from GPUs over plain CPUs.
In short, having played with PyTorch for a couple of days now, I can really see how omnipresent neural networks will become in the near future. A whole lot of new business models will emerge: we'll be able to categorize more data than ever before and put that data into relation.
A future post could explore AI/ML business models.