Just about every chip maker is going through some sort of revolution. Things are changing fast. nVidia just agreed to buy ARM, AMD has been mulling over purchasing Xilinx, and Chinese chip manufacturers are buying modern fabrication equipment. The world is transitioning to 10nm and beyond, and machine learning is here. It’s gonna take quite a bit of software development, though, to keep up with the GPU powerhouse that is nVidia.
Linley Gwennap, a semiconductor analyst with over 20 years in the industry, says that software remains the biggest challenge for any company seeking to compete with nVidia. If they want to play on nVidia’s level in the AI realm, they have a lot of software to write.
Although several chip vendors and cloud-service vendors have developed impressive hardware for AI acceleration, the next hurdle is the software.
Linley Gwennap
Gwennap had this to say about nVidia’s competition:
Software is the hardest word. To compete against deep software stacks from companies such as Nvidia and Intel, these vendors must support a broad range of frameworks and development environments with drivers, compilers, and debug tools that deliver full acceleration and optimal performance for a variety of customer workloads.
Linley Gwennap
He went on to explain that other companies don’t fully support popular machine learning frameworks, such as Google’s TensorFlow. He also noted that some AI models may not even compile properly for many of nVidia’s competitors’ chips.
nVidia’s Got Dem GPUs
nVidia is at the very top of the machine learning ladder. Team Green’s CUDA software has given it a lead on the order of a decade. Also, consider that machine learning models are highly parallel workloads, which means they require, ya know, graphics processors. NEWS FLASH: nVidia makes graphics processors. EVEN NEWSIER FLASH: They make the best ones. NEWS SO FLASHY YOU CAN’T EVEN SEE IT: nVidia gets an unlimited supply of free nVidia graphics processors. The machine learning industry could surely use more competition, but, like, good luck with that.
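To see why “highly parallel workload” and “graphics processor” go hand in hand, here’s a toy sketch (using NumPy on the CPU, purely for illustration; the shapes are made up): a neural network layer is basically one big matrix multiply, and every output element can be computed independently, which is exactly the kind of work a GPU’s thousands of cores eat for breakfast.

```python
import numpy as np

# A single dense layer is just a matrix multiply. Every one of the
# batch * d_out output elements is an independent dot product, so a
# GPU (thousands of small cores) can compute them all at once.
batch, d_in, d_out = 32, 512, 256  # arbitrary example sizes
x = np.random.randn(batch, d_in).astype(np.float32)   # activations
w = np.random.randn(d_in, d_out).astype(np.float32)   # weights

y = x @ w  # 32 * 256 = 8192 independent dot products

print(y.shape)  # (32, 256)
```

Frameworks like TensorFlow dispatch exactly this kind of operation to CUDA kernels on nVidia hardware, which is a big part of the software moat Gwennap is talking about.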
Machine learning is moving from huge data centers to your pocket. The models are becoming more efficient, and the computer in your pocket is getting faster by the minute. Businesses like Imagination and Think Silicon are pushing the low-power AI envelope. Nowadays, AI can run on a tiny, low-power microcontroller.
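One of the tricks that makes models small enough for a microcontroller is quantization: storing weights as 8-bit integers instead of 32-bit floats. Here’s a minimal sketch of symmetric int8 quantization (the weight values are made-up examples, not from any real model):

```python
import numpy as np

# Hypothetical float32 weights from one layer of a small model
weights = np.array([-0.82, 0.11, 0.56, -0.33, 0.97], dtype=np.float32)

# Symmetric linear quantization: pick a scale so the largest
# magnitude maps to 127, then round each weight to the nearest int8.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)

# Dequantize to see the (slightly lossy) reconstruction
recon = q.astype(np.float32) * scale

print(q)        # one byte per weight instead of four
print(np.max(np.abs(recon - weights)))  # worst-case error <= scale/2
```

A 4x smaller model that also runs on cheap integer math is the difference between needing a data center and fitting in your blender.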
nVidia Raises The Stakes
The stakes are getting pretty high. As mentioned before, nVidia is buying ARM for $40 billion. For those of you who don’t know, ARM designs the processors found in almost every smartphone and tablet.
*stares at the 3 Intel atom based phones/tablets that are crying in the corner*
Anyone trying to take market share from nVidia is gonna have to duplicate (good luck) nVidia’s software stack. The thing is, hardly anyone even has the hardware to write that software on. Well, I mean, AMD could. But they’re too busy dominating Intel.
Not only are these ARM processors in your pocket, you can also find a dozen of them in your computer monitor. Another fourteen in your TV. There are more than likely at least two in your blender, and definitely a dozen or so ARM chips powering your Dyson vacuum. So nVidia’s ecosystem is getting a huge boost.
There are some open efforts that are going on, but they are not getting a huge amount of traction. It’s been up to most of the companies to develop their own alternatives, and that’s why it’s been taking so long.
Linley Gwennap
Also: AN810-XNX: Aetina’s new nVidia Jetson-based platform