It's about fifteen times as much as Despicable Me.
In other words, 0.000002010 zettabytes is 15.20 times the amount of Despicable Me, and the amount of Despicable Me is 0.0658 times that amount. (2010 figures) (production data)
The 2010 digitally-animated film Despicable Me was developed by Illumination Entertainment and Mac Guff Ligne and generated 0.0000001320 zettabytes of production data. The film has a running time of 95 minutes.
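The multiples quoted in each of these comparisons come straight from dividing the two zettabyte figures. A minimal sketch of that arithmetic in Python (variable names are illustrative):

```python
# Figures quoted above, in zettabytes.
subject_zb = 0.000002010         # the amount being measured
despicable_me_zb = 0.0000001320  # Despicable Me production data

# How many times larger the subject is, and the reverse comparison.
multiple = subject_zb / despicable_me_zb
inverse = despicable_me_zb / subject_zb

print(f"{multiple:.2f} times")  # roughly fifteen times
print(f"{inverse:.4f} times")   # roughly 0.066 times
```

Every other entry on this page is the same division applied to a different pair of figures.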
It's about eighteen times as much as The Hubble Telescope.
In other words, the amount of The Hubble Telescope is 0.055 times 0.000002010 zettabytes. (a.k.a. Hubble Space Telescope, a.k.a. HST) (2008 figures)
Between its launch in 1990 and 2008, the Hubble Space Telescope gathered 0.000000110 zettabytes of images and other data about astronomical phenomena. Last upgraded during a service mission in 1999, the onboard computer of the Hubble Telescope has just 0.00000000000000180 zettabytes of operating memory (RAM) — less than most smartphones.
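Figures with this many leading zeroes are easier to grasp in everyday units. A quick conversion sketch, assuming the decimal (SI) convention of 10^21 bytes per zettabyte:

```python
ZB = 10**21  # bytes per zettabyte (decimal/SI convention)

hubble_data_zb = 0.000000110         # data gathered, 1990-2008
hubble_ram_zb = 0.00000000000000180  # onboard computer RAM

terabytes = hubble_data_zb * ZB / 10**12
megabytes = hubble_ram_zb * ZB / 10**6

print(terabytes, "TB of observations")  # about 110 TB
print(megabytes, "MB of RAM")           # about 1.8 MB
```

Put that way, eighteen years of Hubble observations fit on a few hundred consumer hard drives, while the onboard computer's working memory is smaller than a single photograph from a modern phone.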
It's about one-twenty-fifth as much as Mozy.
In other words, the amount of Mozy is 25 times 0.000002010 zettabytes. (2009 figures) (total file storage)
Mozy, the online data backup service, stores about 0.0000500 zettabytes of data backed up by its users. Founded in 2005, Mozy grew its customer base to 1 million personal and 60,000 business subscribers in just five years.
It's about twenty times as much as The LHC Data Generated per Second.
In other words, the amount of The LHC Data Generated per Second is 0.045 times 0.000002010 zettabytes. (a.k.a. Large Hadron Collider) (2008 figures)
Capturing millions of measurements per second on millions of subatomic particles, the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) facility in Geneva generates 0.0000000900 zettabytes of data every second. Data collection arrays are placed around the LHC's circular track, which measures 8.6 km (5.3 mi) in diameter.
It's about thirty times as much as The Google Earth database.
In other words, 0.000002010 zettabytes is 30.60 times the amount of The Google Earth database, and the amount of The Google Earth database is 0.0327 times that amount. (2006 figures) (raw imagery and indexes storage)
As of 2006, Google was storing 0.00000006570 zettabytes of raw image and index data for its satellite photo and virtual globe application, Google Earth. The application offers high-resolution satellite imagery of 60% of the populated areas of the world, according to 2010 estimates.
It's about one-thirty-fifth as much as The Books in the Library of Congress.
In other words, the amount of The Books in the Library of Congress is 35 times 0.000002010 zettabytes. (2009 figures) (entire collection, digitized)
The total collection of books, photographs, and other media housed by the United States Library of Congress would occupy about 0.0000700 zettabytes if fully digitized. The collection contains a total of 142,544,498 items as of 2009.
It's about 50 times as much as Amazon.com's databases.
In other words, 0.000002010 zettabytes is 49.81 times the amount of Amazon.com's databases, and the amount of Amazon.com's databases is 0.0201 times that amount. (largest databases only; 2005 figures)
Amazon.com maintains information on the millions of items sold on its e-commerce website and the websites of its affiliate companies, as well as customer order and browsing histories and excerpts from nearly a quarter-billion books, in databases totaling an estimated 0.00000004035510 zettabytes of data. Amazon.com receives over 615 million visits to its US website each year.