Big Data Just Got Bigger



    Imagine you have ten truckloads of sand and you have to follow the motion of
    every grain as the trucks dump their loads. Then a dust devil comes along and
    swirls it all into the air. Where did every grain of sand go? A physicist at
    Los Alamos National Laboratory had a similar problem: his supercomputer
    simulation needed to generate trillions of particles and follow their paths.
    The Trinity supercomputer at the Lab created a file for every particle, but
    that was far too many files to keep track of, and the simulation crashed.
    Los Alamos supercomputing experts figured out a way to rapidly create one
    trillion files on Trinity. It took less than two minutes, a world record, and
    it's a million times more files than you'd find on the average laptop. This
    gigantic file capacity allowed the physicist to follow every particle along
    its crazy path through the simulation.
    Next step? Department of Energy and Los Alamos computer scientists are working
    to create an exascale computer that can track even more particles. It could
    also follow each one's trajectory, velocity, temperature, energy, spin, and
    other attributes while they're all whirling in a hurricane. And that's not
    just dust in the wind: the ability to follow so many particles is key to the
    Lab's work maintaining the safety, security, and effectiveness of the US
    nuclear arsenal.
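    To picture the one-file-per-particle approach, here is a minimal Python
    sketch. It is an illustration only, not the Lab's actual software or file
    system: the particle count, directory layout, and attribute values below are
    all hypothetical placeholders standing in for the attributes the article
    mentions.

    ```python
    import json
    import os
    import tempfile

    # Hypothetical sketch: write one file per simulated particle, each file
    # holding the attributes named in the article (trajectory sample, velocity,
    # temperature, energy, spin). The record-setting run created a trillion
    # such files; here we create only 1,000 for illustration.
    def write_particle_files(root, n_particles):
        for pid in range(n_particles):
            record = {
                "id": pid,
                "position": [0.0, 0.0, 0.0],   # placeholder values
                "velocity": [0.0, 0.0, 0.0],
                "temperature": 300.0,
                "energy": 1.0,
                "spin": 0.5,
            }
            with open(os.path.join(root, f"particle_{pid}.json"), "w") as f:
                json.dump(record, f)

    root = tempfile.mkdtemp()
    write_particle_files(root, 1000)
    print(len(os.listdir(root)))  # → 1000
    ```

    Even this toy version hints at the real bottleneck: every file created is a
    metadata operation the file system must track, which is why a trillion files
    overwhelms ordinary storage and why creating them in under two minutes was a
    record.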
