Build a National Canadian Supercomputer Centre
Theme: Digital Infrastructure
Idea Status: +21 | Total Votes: 37 | Comments: 8
Much of the discussion around the digital economy has focused on digital communications and digital content rights, that is, access to and ownership of digital content. We also need to consider the digital technologies used for the discovery of new information. I'm not talking about more opinion blogs or more 'how to' videos, but the discovery of genuinely new knowledge. Science has in many instances gone digital, with new knowledge discovered through numerical methods, and these discoveries are made on supercomputers.
Many advanced nations use national supercomputing facilities to enhance the competitiveness of their economies. We compete with these economies, but the supercomputers built in Canada are largely restricted to universities and are not built to a comparable scale. If we want to be taken seriously in the global economy as an innovative nation, a world-class supercomputing facility should be built in Canada as a collaboration of industry, government, and universities. If this is our "moon shot", then we need to think big.
How big? Based on the size of our economy relative to the US, Canada should already have a petaFLOP-scale machine in the planning stage. The largest supercomputer in Canada is ranked 22nd in the Top500 list of the world's supercomputers, at less than a third of a petaFLOP in theoretical peak performance. The US already has three supercomputers above a petaFLOP; Germany and China have the other two. Russia is expected to upgrade its largest machine into the petaFLOP range this year, and the US is planning to bring online a 10 petaFLOP system next year.
The first teraFLOP machine came online in 1997, and the first petaFLOP machine came online in 2008. In Europe and the US, planning has already begun for exaFLOP machines. Extrapolating from those two data points, an exaFLOP system should be feasible around 2019.
Thus for Canada's digital moon shot, given that it takes years to plan and build these "grand challenge science" machines, I would suggest the following targets:
- 1 petaFLOP machine in 2012
- 10 petaFLOP machine in 2015
- 100 petaFLOP machine in 2017
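The extrapolation behind these targets can be sketched in a few lines. The two milestone years (1997 for the first teraFLOP system, 2008 for the first petaFLOP system) are from the post above; the `year_to_reach` helper is just an illustrative name:

```python
from math import log10

t1, f1 = 1997, 1e12  # first teraFLOP machine
t2, f2 = 2008, 1e15  # first petaFLOP machine

# Annual growth factor implied by the two milestones:
# a factor of 1000 over 11 years, roughly 1.87x per year.
growth = (f2 / f1) ** (1 / (t2 - t1))

def year_to_reach(target_flops):
    """Year at which the trend line reaches target_flops."""
    return t2 + log10(target_flops / f2) / log10(growth)

print(round(year_to_reach(1e18)))  # exaFLOP → 2019
```

The same trend line puts a 10 petaFLOP system around 2013 and a 100 petaFLOP system around 2019, so the 2015/2017 targets above would mean catching up to the leading edge, not just following it.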
dsanden — 2010-05-13 10:24:51 EDT wrote
An EU grid computing initiative: BEinGRID
Ron Van Holst — 2010-05-15 21:38:47 EDT wrote
Thanks for the link, it has great examples of industrial applications.
Ron Van Holst — 2010-06-24 08:26:51 EDT wrote
I found this interesting graphic on Mashable.
It shows supercomputing capacity per country, with relative area on the graph based on the latest Top500 data. Canada has roughly the capacity of Sweden, yet our economy measured by GDP is about twice the size. Russia, with which we are vying over arctic sovereignty, is considerably ahead of us in supercomputer infrastructure, even though its economy is roughly the same size as ours. To follow this thread: management of arctic resources will play a big role in future wealth, and that means digital management. I'm not just talking about exploitation, but responsible use where risks are properly managed using numerical methods. Take the simulation of the BP oil spill on Mashable (cool animation): this kind of modelling should have been done ahead of time. Not only should we manage our own arctic risks digitally, these types of models could be used to hold the other arctic nations accountable for their activities, since a mistake by one would be disastrous to all. This is an area where Canada could show leadership; the technologies to build the supercomputers and the expertise to model climate, ocean currents, weather, oil flows, etc. all exist. Private enterprises do not properly model their risks; they are too driven by quarterly profit.
robert_saric — 2010-07-06 10:19:38 EDT wrote
The advances in mainstream computing brought about by improved processor performance have enabled some former supercomputing needs to be met by clusters of commodity processors, but this doesn't mean we should become complacent. Government is the primary user of supercomputing in Canada. To maintain our level of achievement in supercomputing and its applications, and to keep from falling behind both other nations and our own needs, a renewed national effort is required. Good initiative Ron.
Ron Van Holst — 2010-07-06 12:05:45 EDT wrote
I wrote a blog post with some examples of supercomputer use. In many cases, excellent work is already being done in Canada, but we should scale up most of these efforts, as well as invest in new areas of supercomputer research for Canada.
slimsamu — 2010-07-07 08:06:06 EDT wrote
Excellent proposal Ron. Canada does indeed deserve and require such facilities in order to be highly competitive on the world stage. Bravo.
couchman — 2010-07-12 14:34:11 EDT wrote
This idea meshes well with the excellent submission and idea by Compute Canada. You might also be interested in the analysis on pp. 5-6 of the SHARCNET newsletter.
Ron Van Holst — 2010-07-13 12:54:30 EDT wrote
Thanks, Hugh, for the newsletter; I found it very interesting. I guess I'm adding a voice outside the academic community for High Performance Computing in Canada. I don't think we should be shy about shooting for a top 10 system in the Top500. I'm sure Canadian researchers, in both the public and private sectors, will put it to very good use. We'll then be taken seriously as an innovative nation on the world stage, and I have no doubt that it will pay the Canadian economy back many times over.
One more voice for Supercomputing in Canada,
Ron Van Holst.