Electric vehicle maker Tesla has released a new white paper describing a new standard for its Dojo supercomputing platform.
CEO Elon Musk teased the paper as being “more important than it may seem”, reports Electrek. In the abstract, the automaker describes a new standard to work with its computing platform.
“This standard specifies Tesla arithmetic formats and methods for the new 8-bit and 16-bit binary floating-point arithmetic in computer programming environments for deep learning neural network training,” according to the report.
“This standard also specifies exception conditions and the status flags thereof. An implementation of a floating-point system conforming to this standard may be realized entirely in software, entirely in hardware, or in any combination of software and hardware,” it added.
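To give a rough sense of what an 8-bit binary floating-point format involves, the sketch below decodes a single byte into a real number. This is a minimal illustration, not Tesla's specification: the white paper's actual bit layouts, configurable parameters and exception/status-flag behaviour are not reproduced here, and the 1-sign/4-exponent/3-mantissa split with a fixed bias of 7 is purely an assumption for the example.

```python
# Minimal sketch, NOT Tesla's specification: assumes a generic layout of
# 1 sign bit, 4 exponent bits, 3 mantissa bits and a fixed bias of 7,
# and omits special values (NaN/Inf) for brevity.

def decode_fp8(byte: int, exp_bits: int = 4, man_bits: int = 3, bias: int = 7) -> float:
    sign = -1.0 if (byte >> (exp_bits + man_bits)) & 0x1 else 1.0
    exponent = (byte >> man_bits) & ((1 << exp_bits) - 1)
    mantissa = byte & ((1 << man_bits) - 1)
    if exponent == 0:  # subnormal: implicit leading 0, minimum exponent
        return sign * (mantissa / (1 << man_bits)) * 2.0 ** (1 - bias)
    return sign * (1.0 + mantissa / (1 << man_bits)) * 2.0 ** (exponent - bias)

print(decode_fp8(0b00111000))  # exponent 7, mantissa 0 -> 1.0
print(decode_fp8(0b00111100))  # exponent 7, mantissa 4 -> 1.5
```

The appeal of such narrow formats for neural network training is that each value takes a quarter of the memory and bandwidth of a standard 32-bit float, at the cost of precision and range.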
For years now, Tesla has been teasing the development of a new in-house supercomputer optimised for neural net video training. Tesla is handling an insane amount of video data from its fleet of over 1 million vehicles. It uses this video data to train its neural nets.
Over the last two years, Musk has been teasing the development of Tesla’s own supercomputer called “Dojo”. Last year, he even teased that Tesla’s Dojo would have a capacity of over an exaflop, which is one quintillion (10^18) floating-point operations per second, or 1,000 petaFLOPS.
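The unit conversion quoted above is just SI prefixes; a one-line check (nothing Tesla-specific):

```python
exaflops = 10**18             # one quintillion floating-point ops per second
petaflops = 10**15
print(exaflops // petaflops)  # 1000 -> an exaflop is 1,000 petaFLOPS
```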
The automaker already has its Dojo chip and tile, but it is still working on building the full rack to create the supercomputer.