Bloomberg used 1.3 million GPU hours and 600 billion documents to train BloombergGPT
WHY THIS MATTERS IN BRIEF Creating even “small” AI models takes an extraordinary amount of compute, but as hardware gets more powerful and cheaper, and as AI training algorithms become more efficient, this effort will drop.
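To give a sense of scale for the 1.3 million GPU-hour figure, here is a back-of-the-envelope sketch of the wall-clock time it implies. The cluster size used below (512 GPUs) is a hypothetical assumption for illustration only; the GPU-hour total is the only figure taken from the article.

```python
# Back-of-the-envelope wall-clock estimate for 1.3 million GPU hours.
# NUM_GPUS is a hypothetical cluster size, not a figure from the article.
GPU_HOURS = 1_300_000   # total compute reported for training BloombergGPT
NUM_GPUS = 512          # assumption: illustrative cluster size

wall_clock_hours = GPU_HOURS / NUM_GPUS
wall_clock_days = wall_clock_hours / 24

print(f"{wall_clock_days:.0f} days on {NUM_GPUS} GPUs")  # → 106 days on 512 GPUs
```

Even on a sizeable cluster, a training run of this scale occupies the hardware for months, which is why compute cost dominates the economics of building such models.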