Facebook has posted a job opening for an expert in ASIC and FPGA design, two approaches to custom silicon that companies can tailor to specific use cases, particularly in machine learning and artificial intelligence.
There’s been a lot of speculation in the valley as to what Facebook’s custom silicon might look like, especially as the company looks to optimize its machine learning tools, something CEO Mark Zuckerberg has pointed to as a potential way of identifying misinformation on Facebook using AI.
The whispers about Facebook’s customized hardware vary depending on who you talk to, but they generally center on operating over the massive graph of personal data Facebook possesses. While a camera’s input is a bounded set of data points, a grid of pixels, Facebook’s knowledge of you goes well beyond your list of friends and down to your minute preferences, a data set so large that it demands a new approach to processing it quickly.
Most in the industry speculate that the hardware is being optimized for Caffe2, the AI framework deployed at Facebook, which would help it tackle those kinds of complex problems. Customized silicon generally targets either inference (the “is that a cat” part of machine learning) or training (“this is what a cat is”). On either end, the goal is speeding up relatively simple math operations from a field called linear algebra. But we’ve been hearing about this for a while now, and it seems like Facebook is about to be much more overt about the process.
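Those “relatively simple math operations” are essentially matrix multiplies. The NumPy sketch below is purely illustrative (it has nothing to do with Facebook’s actual stack) and shows what the split looks like for a single dense layer: inference is a matrix-vector product followed by a softmax, and one training step is an outer-product gradient update. This multiply-accumulate work is exactly what ML-oriented custom silicon is built to accelerate.

```python
# Illustrative sketch only: inference and training for one dense layer
# both reduce to dense linear algebra.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((10, 784)) * 0.01   # weights: 10 classes, 784 input features
x = rng.standard_normal(784)                # one flattened input image
y = np.zeros(10); y[3] = 1.0                # one-hot label ("this is what a cat is")

# Inference ("is that a cat?"): a matrix-vector product plus a softmax.
logits = W @ x
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print("predicted class:", probs.argmax())

# Training (one gradient step): the cross-entropy gradient w.r.t. W is an
# outer product, again just a dense matrix operation.
grad_W = np.outer(probs - y, x)
W -= 0.1 * grad_W
```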
An FPGA is a more flexible, reprogrammable design, which Intel has championed as a way to adapt to a fast-changing machine learning landscape. The downside commonly cited for FPGAs is that they are niche hardware, complex to program and tune, and expensive, making them less of a catch-all solution for machine learning projects. An ASIC, by contrast, is a piece of silicon a company customizes for one specific task, like mining cryptocurrency, trading the FPGA’s flexibility for speed and efficiency.
While the whispers about Facebook’s potential hardware efforts grow louder and louder, this posting serves as at least another partial data point that the company is looking to dive deep into custom hardware for its AI problems. That hardware would mostly live on the server side, though Facebook is also looking into other devices like a smart speaker. Given the immense amount of data Facebook has, it would make sense for the company to look into customized hardware rather than rely on off-the-shelf components like GPUs from Nvidia.
Most of the other large players are already exploring their own customized hardware. Google has its TPU for internal workloads, while Amazon is reportedly working on chips for both training and inference. Apple, too, is reportedly working on its own silicon, which could eventually push Intel out of its line of computers. Microsoft is also diving into FPGAs as a potential approach to machine learning problems.