0:00
This is DeepSeek, a large language model developed by a Chinese AI lab of the same name. DeepSeek is sending shock waves through the AI industry.
0:10
Training AI models is very expensive: OpenAI, Anthropic, and Meta spend more than $100 million just on compute. DeepSeek managed to do it with just $5 million, and their model matches or outperforms GPT-4 and Claude.
0:24
DeepSeek managed to do this by rethinking everything from the ground up. Instead of storing every number with 32 bits of precision, they used 8 bits, saving 75% of the memory in the process.
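The 75% figure follows directly from the storage sizes: 8 bits per number instead of 32. Here is a minimal sketch in NumPy, using simple int8 quantization to illustrate the memory arithmetic (DeepSeek's actual training uses an FP8 floating-point format with scaling tricks, which NumPy does not natively support, but the size comparison is the same):

```python
import numpy as np

# a toy weight vector stored at full 32-bit precision
weights = np.random.randn(1024).astype(np.float32)

# simple symmetric int8 quantization -- illustrative only;
# real FP8 training uses an 8-bit floating-point format,
# but the memory arithmetic is identical
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)

saving = 1.0 - quantized.nbytes / weights.nbytes
print(f"memory saved: {saving:.0%}")   # memory saved: 75%
```

Four bytes per number shrink to one, so the same parameter count fits in a quarter of the memory.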
0:35
They also implemented multi-token prediction, which is faster while staying roughly 90% accurate.
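The speedup from multi-token prediction comes from emitting several tokens per forward pass instead of one. A back-of-the-envelope sketch (the token counts here are made up for illustration; the real system also has to verify the extra tokens, which is where the roughly-90%-accurate figure matters):

```python
import math

def decode_steps(n_tokens: int, tokens_per_step: int) -> int:
    """Forward passes needed to emit n_tokens."""
    return math.ceil(n_tokens / tokens_per_step)

# standard next-token decoding: one token per forward pass
baseline = decode_steps(1000, 1)    # 1000 passes

# predicting 2 tokens per pass roughly halves the passes,
# assuming the extra token is usually accepted
multi = decode_steps(1000, 2)       # 500 passes

print(baseline, multi)
```

When a predicted extra token is wrong it has to be regenerated, so the realized speedup is a bit below the ideal 2x.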
0:40
They also built a mixture-of-experts system, where experts only wake up when they are needed.
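"Experts that only wake up when needed" describes a mixture-of-experts layer: a router scores every expert for each input, but only the top-scoring few are actually computed. A self-contained NumPy sketch with hypothetical sizes (4 experts, top-2 routing); DeepSeek's models use far more experts and a more elaborate router:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, N_EXPERTS, TOP_K = 8, 4, 2

# each "expert" is just a weight matrix in this toy version
experts = [rng.standard_normal((DIM, DIM)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((N_EXPERTS, DIM))
experts_run = []   # record which experts actually compute anything

def moe_layer(x):
    scores = router @ x                    # one routing score per expert
    top = np.argsort(scores)[-TOP_K:]      # pick the TOP_K best experts
    w = np.exp(scores[top] - scores[top].max())
    w /= w.sum()                           # softmax over the chosen experts
    out = np.zeros(DIM)
    for weight, i in zip(w, top):
        experts_run.append(int(i))         # only these experts "wake up"
        out += weight * (experts[i] @ x)
    return out

y = moe_layer(rng.standard_normal(DIM))
print(len(experts_run), "of", N_EXPERTS, "experts ran")  # 2 of 4 experts ran
```

Because only TOP_K experts run per token, compute per token stays small even as the total parameter count grows.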
0:46
This approach makes APIs cheaper, reduces training cost, needs fewer GPUs, and the models can even run on gaming GPUs. Best of all, DeepSeek is open source, out there in public for everyone to see. Subscribe for more!