[PRL] fp for parallel programming even at MS

Matthias Felleisen matthias at ccs.neu.edu
Fri Aug 3 15:16:05 EDT 2007


[Note: Burton Smith is closer to Ken Kennedy and Fortran than to FP.]


M'soft: Parallel programming model 10 years off

Rick Merritt
(07/23/2007 9:00 AM EDT)
URL: http://www.eetimes.com/showArticle.jhtml?articleID=201200019

Redmond, Wash. -- Multicore processors are driving a historic shift  
to a new parallel architecture for mainstream computers. But a  
parallel programming model to serve those machines will not emerge  
for five to 10 years, according to experts from Microsoft Corp.



Many new and revised programming languages are in development here  
and elsewhere that will act as key enablers for the parallel hardware  
architectures that are themselves still emerging, according to Burton  
Smith, a parallel computing guru who oversees research in the field  
at Microsoft.



"There is a fundamental change in computer architecture coming,"  
Craig Mundie, chief research and strategy officer at Microsoft, said  
in an interview with EE Times. "I personally think this is one of the  
most disruptive things the industry will have to go through."



Microsoft aims to help define a new programming model that also  
introduces a more formal, structured software development process.



"We would like to figure out how to make software more 'composable,'  
" Mundie said. "So if we make this transition the right way, we can  
get a twofer."



Mundie hired Smith in late 2005 to oversee research in parallel  
programming architectures. Smith had pioneered work in multithreaded  
systems as chief scientist of startup Tera Computer, which went on to  
acquire the Cray Research business and rename itself Cray Inc. Smith  
said parallel programming, once confined to novelty supercomputers,  
is coming to mainstream systems.



"We need to be able to write programs that run on the next 20  
generations of Dell computers, even if the number of processors in  
them goes up by a factor of 16," said Smith. "This field won't  
continue to grow, be vital and solve society's problems unless we  
reinvent it."



The transition will take time. "The requirement to drive solutions in  
the space is on a five-year horizon," said Mundie. "So three or four  
years from now, there better be some interesting alternatives people  
are getting experience with."



At the Windows Hardware Engineering Conference in May, engineers  
revealed they were closing in on a plan to revamp the way the Windows  
Server kernel schedules threads so the operating system can support a  
greater number of simultaneous threads. But that work is just one  
small part of the puzzle.



"That's necessary, but not sufficient," said Mundie. "The server  
world has an easier problem because it is driven by load. But the  
techniques for in- creasing [thread support on servers] won't go far  
enough to support the fine-grained parallelism we need to boost  
single-core performance."



New parallel programming languages are key to the software side of  
the architectural shift. "Parallel programming languages are, I  
think, the most important issue in computing today," said Smith.



Blizzard of languages
Microsoft has at least five efforts heading in that direction, and  
many more are bubbling up from researchers around the globe. One  
researcher at Intel said he believes the most important job for  
Microsoft now is to pick one parallel language and drive it forward  
in the industry, but Smith disagrees.



"It isn't about getting to a definition as quickly as possible,"  
Smith said. "I don't think one programming language will fit all our  
needs.



"If there is a blizzard of languages, it's OK," he added. "As long as  
AMD and Intel can run a decent subset of them, it won't matter how  
many there are."



Smith noted that 20 years ago, the computer industry used a wide  
variety of general-purpose languages. Even today, developers use  
multiple languages--C++, Java, Python and others. Having three or  
four strong, popular parallel languages 20 years from now is a  
reasonable expectation, he said.



"We will need language interoperability, and .NET is one way to get  
it," Smith added, referring to Microsoft's developer framework for  
Windows.



"I am in line with Burton's [Smith's] approach," said Margaret Lewis,  
a software strategist at Advanced Micro Devices Inc. "Some people  
might argue Microsoft's time frame is a relatively fast for such a  
change, but one-size-fits-all is not something we see in software.  
There is no consistency in the languages and tools people use today."



So far, there are no parallel languages that are widely accepted and  
useful on general-purpose systems. "We have some ideas, but we don't  
have a language," Smith said.



Current thinking about parallel languages falls broadly into two  
schools. One camp emphasizes functional programming, a style that  
expresses computation as the evaluation of mathematical functions and  
avoids mutable state and side effects. Because such programs share no  
mutable state, independent pieces can safely run in parallel.
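
To make that property concrete, here is a minimal sketch in Haskell  
(illustrative only, not code from the article; it assumes the  
"parallel" package and a GHC build with -threaded):

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- A pure function: its result depends only on its argument,
    -- so evaluating many applications in parallel cannot race on
    -- shared state.
    square :: Int -> Int
    square x = x * x

    main :: IO ()
    main = print (parMap rdeepseq square [1 .. 1000])

Here parMap is free to evaluate the list elements on as many cores  
as are available, precisely because square has no side effects.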



Another approach is emerging around so-called atomic memory  
transactions, which group many reads and writes into blocks that  
execute as a single indivisible unit against shared memory, without  
the traditional locks that can stall other processes.
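
A minimal sketch of the idea, using the interface of Haskell's  
software transactional memory library (illustrative, not Microsoft  
code): both account updates below commit together or not at all, and  
no lock is ever named.

    import Control.Concurrent.STM

    -- Move money between two accounts. The reads and writes form one
    -- transaction; if a concurrent transaction conflicts, this one is
    -- automatically retried.
    transfer :: TVar Int -> TVar Int -> Int -> STM ()
    transfer from to amount = do
      balance <- readTVar from
      writeTVar from (balance - amount)
      modifyTVar' to (+ amount)

    main :: IO ()
    main = do
      a <- newTVarIO 100
      b <- newTVarIO 0
      atomically (transfer a b 25)
      readTVarIO a >>= print  -- prints 75
      readTVarIO b >>= print  -- prints 25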



"I believe more and more that the right way to go is to combine  
functional programming and transactional memory. They are almost made  
for each other," said Smith.



Microsoft researchers in Cambridge, England, are now building support  
for software transactions into Haskell, a functional language whose  
principal compiler they develop. Separately, a group in Redmond has  
already put some functional techniques into Microsoft's C# language  
in the form of Language Integrated Query (LINQ), and they plan more  
work in that area.
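
The declarative, functional query style that LINQ brought to C# has a  
direct analogue in Haskell's list comprehensions; a small sketch with  
made-up data, for comparison only:

    -- Select the names of adults, in the query style LINQ brought
    -- to C#. The data here is hypothetical.
    adults :: [(String, Int)] -> [String]
    adults people = [name | (name, age) <- people, age >= 18]

    main :: IO ()
    main = print (adults [("Ada", 36), ("Tim", 12), ("Grace", 45)])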



"C# is becoming a more functional language every day," quipped Smith.  
"Building in support for atomic transactions also is in the plan."



Meanwhile, Microsoft has released a new language called F# to a group  
of beta testers. Smith described it as a very strictly typed dialect  
of the functional language OCaml and said that F# is being employed  
for applications such as scripting.



"What products [the parallel work] will appear in is still to be  
determined. It could be C#, Visual Basic, C++ or F#," said Smith. "I  
think we ultimately will see atomic transactions in most, if not all,  
languages. That's a bit of a guess, but I think it's a good bet."



The languages also must support both shared-memory and  
message-passing schemes for interprocessor communications, Smith  
said. Separately, Microsoft is working on ways to make its legacy  
code more parallel.
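
To make the distinction concrete (a sketch, not from the article): in  
Haskell, an MVar is a piece of shared mutable state that threads  
update in place, while a Chan carries messages from one thread to  
another.

    import Control.Concurrent (forkIO, threadDelay)
    import Control.Concurrent.Chan
    import Control.Concurrent.MVar

    main :: IO ()
    main = do
      -- Shared memory: threads communicate by updating one location.
      counter <- newMVar (0 :: Int)
      _ <- forkIO (modifyMVar_ counter (return . (+ 1)))

      -- Message passing: a producer sends; the consumer receives.
      chan <- newChan
      _ <- forkIO (writeChan chan "work item")
      readChan chan >>= putStrLn

      threadDelay 10000          -- let the counter thread finish
      readMVar counter >>= print -- prints 1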



Inside the chips
Before the software can be fully defined, multicore processors need  
to step deeper into the parallel future. Smith said that his focus is  
on a programming model for processors with well more than eight cores.



Today's dual- and quad-core processors for PCs are still "more of  
the same," said Smith. "We are putting copies of what we use on a  
die, and that is not the right answer."



In the future, CPUs will implement fine-grained concurrency. Indeed,  
they will use message-passing techniques between cores to handle  
dependencies between tasks running within a chip.



Chip designers need to experiment with a much broader range of  
multithreading techniques than are currently employed in hardware.  
And they must find ways to let tasks wait for resources without  
stalling other processes, said Smith.



"Better context switching is what we are talking about," he said.



In addition, chips will need to implement intelligent I/O blocks that  
can translate virtual addresses into physical memory locations for  
each core.



The CPUs may also have to dedicate hardware to accelerating atomic  
memory transactions. In all these areas, AMD and Intel face a period  
of more experimentation before new multicore CPU standards can emerge.



"Ultimately, all three of us [AMD, Intel and Microsoft] want  
something somewhat common," said Smith. "But there is no need to  
drive quickly to a standard."



Indeed, some Microsoft researchers are exploring the trade-offs of  
various microprocessor techniques. But the Windows giant insists it  
has no plans to turn out its own CPUs. Instead, the company intends  
to share what it learns with chip manufacturers.



"We do look at everything, but we would not be fabbing up a processor  
ourselves," said Mundie.


