Facebook has announced that it will share more of its work on artificial intelligence and machine learning with the world. The company has already open-sourced much of its AI software; now it will publish the designs for its AI hardware as well, contributing them through the Open Compute Project. This won’t be much use to regular consumers, but it is extremely valuable for researchers working on AI.
The design in question is called Big Sur, and it incorporates eight PCIe GPUs of up to 300 watts each. It was built with Nvidia’s Tesla M40 GPU in mind but can accept any commercially available graphics card, and it uses Nvidia’s Tesla Accelerated Computing Platform to make the processing work more efficient. Facebook has also made the Open Rack-compatible hardware easy to service, mostly by removing rarely used parts and making the components that need frequent replacement easy to reach.
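To put those figures in perspective, a quick back-of-the-envelope calculation shows the GPU power headroom a single Big Sur node allows, using only the numbers above (eight GPUs at up to 300 watts each). This is illustrative arithmetic, not a published spec; a real server also draws power for CPUs, memory, and cooling.

```python
# GPU power headroom per Big Sur node, from the figures in the article.
GPUS_PER_NODE = 8
MAX_WATTS_PER_GPU = 300  # upper limit per PCIe GPU slot

gpu_power_headroom = GPUS_PER_NODE * MAX_WATTS_PER_GPU
print(f"{gpu_power_headroom} W of GPU power headroom per node")  # 2400 W
```

That 2,400-watt ceiling for the GPUs alone helps explain why this is datacenter hardware rather than something a hobbyist would run at home.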
Big Sur is currently in use at Facebook, doing quite a bit of heavy lifting. The AI is able to “read stories, answer questions about scenes, play games and even learn unspecified tasks through observing some examples”. In other words, it has proven itself to be very good at doing what Facebook wants it to do.
Not everyone will be able to go out and build one of these machines for themselves. The design is mainly aimed at AI researchers who want a system that can be assembled from off-the-shelf parts.