New computational sciences centre
Jul. 16th, 2008 08:16 pm

Monday's press release from DIUS mentions that fifty million pounds is to be spent on a new computational sciences centre at Daresbury, presumably to compensate for the eventual decommissioning of HPCx. And while it looks like a big chunk of the cash will go on a new building — the STFC page on the new centre makes this explicit — it seems reasonable to suppose that at least some of the leftovers might be spent on some new iron.
And if there is money to be spent on machinery, it might be worth pondering how it might be spent, because something like an SX-9 might actually suit them rather well. Sure, big vector is a dead technology walking, but it would give them a big shiny USP and make them unique among the UK scientific centres. Cray, on the other hand, might well be out of luck, because it seems to me unlikely that another centre would want — or would be able, politically at least — to justify the cost of duplicating part of the HECToR facility. IBM, I suppose, are a possibility, although they may suffer because their systems are unlikely to be sufficiently different to HPCx to catch the imagination.
The likes of Dell, HP, etc. might get lucky, but again, I suspect that these putative PC vendors are likely to suffer from their lack of uniqueness. These days, it seems as though every university has some sort of clustered system, which — excellent machines though they may well be — would seem to tarnish their appeal for a national centre that wants to stand out from the crowd.
All of this is, of course, speculation from a position of staggering ignorance, but it's still fun to imagine that vector may still rise again. Who knows, if the new Hartree Centre building takes long enough, they might just find themselves perfectly placed to buy up a glut of reconditioned SX-8s — only one careful lady owner, a full logbook and maintenance history, and, perhaps, GFS v1 thrown in for free...
no subject
Date: 2008-07-17 08:00 am (UTC)

I am not going to comment on the future of the SX line. However, even if traditional vector computing is not on the rise, SIMD programming definitely is.
http://en.wikipedia.org/wiki/Larrabee_%28GPU%29
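To give a flavour of what that style of programming looks like, here is a minimal sketch of my own using the SSE intrinsics. It is a toy, not anything Larrabee-specific: it assumes an x86 compiler that ships xmmintrin.h, 16-byte aligned arrays, and a length that is a multiple of four.

    #include <stddef.h>
    #include <xmmintrin.h>  /* SSE intrinsics */

    /* Add two float arrays four elements at a time using the
       128-bit SSE registers. Assumes 16-byte aligned pointers
       and n a multiple of 4. */
    void add_simd(const float *a, const float *b, float *c, size_t n)
    {
        for (size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(&a[i]);
            __m128 vb = _mm_load_ps(&b[i]);
            _mm_store_ps(&c[i], _mm_add_ps(va, vb));
        }
    }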
no subject
Date: 2008-07-17 01:48 pm (UTC)

Secondly, the current state of the vector market doesn't allow for a great deal of choice: you can have any machine you want, as long as it's an SX. This means that except where people are willing to completely re-optimise for a new platform — something that only really works when you've got a limited code base — the choice of buying and tuning for vector means effectively tying yourself to NEC. While this may be attractive to NEC, it's hard to see what it offers the customer, especially if, as mentioned above, their code doesn't currently vectorise and would need to be ported to an SX platform.
I agree that the trend towards using GPUs is interesting, but I'm not sure what impact it will have on large scale HPC. My unscientific feeling is that the people who've chosen to use GPUs are using them to accelerate small to medium scale projects, rather than using them as a replacement for grand challenge science. I suspect that this is partly due to the inertia of GC work, partly due to the lack of a standard language (although OpenCL (http://en.wikipedia.org/wiki/OpenCL) may remedy this) and partly due to the inability of large scale HPC to take advantage of any work done to optimise a code for GPU. This latter, I think, is important when you consider the way that fast scalar on the desktop (or by the deskside) helped put paid to trad vector: people wrote code for their workstation or for a network of workstations, then when these ran out of computational capacity, they were able to transfer their code relatively unaltered to a faster cluster or to a T3E or something like that. OK, so I've oversimplified, but I still think there might be something to it...
no subject
Date: 2008-07-18 10:10 am (UTC)

This is not always true, but most vectorized codes can perform very well on scalar platforms. The problem is the other way round: unvectorized codes need to be vectorized to run on traditional vector machines, and this can sometimes be a big investment.
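A toy illustration of why that investment can be non-trivial (my own example, nothing SX-specific): the first loop below vectorises trivially, while the second has a loop-carried dependence and has to be restructured before any vector unit can help.

    #include <stddef.h>

    /* Vectorises trivially: every iteration is independent, so a
       vectorising compiler can process many elements at once. */
    void axpy(size_t n, double a, const double *x, double *y)
    {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    /* Does not vectorise as written: each iteration reads the
       result of the previous one, so the recurrence must be
       restructured or rethought before vector hardware helps. */
    void running_sum(size_t n, double *s)
    {
        for (size_t i = 1; i < n; i++)
            s[i] = s[i-1] + s[i];
    }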
no subject
Date: 2008-07-18 02:00 pm (UTC)

I'm quite willing to accept that some vector codes perform well on scalar machines. What I'm less willing to accept is that this justifies the creation of vector code. Would you want to risk writing a vector application from scratch, without access to vector hardware (and without ftrace to check that your code really does vectorise), on the off chance that you might one day run on a vector machine?
no subject
Date: 2008-07-18 02:42 pm (UTC)

And there are plenty of vector machines around that you can use. Just pick an Intel Core 2 CPU and use its SSE units. This is vectorization. In certain cases it can give a very good speed-up. But vector coding is not difficult; in scientific computing, this is the natural way of coding.
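To make that concrete, here is the kind of loop I mean, a sketch of my own: compiled with something like gcc -O3 -msse2 -ffast-math, the compiler should map it onto the SSE units without any intrinsics at all. The relaxed floating-point flag is needed, if I recall the gcc behaviour correctly, because vectorising the reduction reorders the additions.

    #include <stddef.h>

    /* A plain dot product, written the "natural" way. A
       vectorising compiler (e.g. gcc -O3 -msse2 -ffast-math)
       can execute this four floats at a time on the SSE units. */
    float sdot(size_t n, const float *x, const float *y)
    {
        float sum = 0.0f;
        for (size_t i = 0; i < n; i++)
            sum += x[i] * y[i];
        return sum;
    }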