
Techies in line for greatest Brit award

Tech pioneers who helped make computing and the Web possible have been nominated in a nationwide poll to name the greatest Brit.

The father of computing, the founder of computer science and the inventor of the World Wide Web all rank among the hundred greatest Britons, according to the U.K. public.

Charles Babbage, Alan Turing and Tim Berners-Lee have all been placed on the short list via a nationwide survey conducted by the BBC. More than 30,000 people took part in the poll. The overall winner will be chosen by a public vote later this year.

The BBC revealed Wednesday the names of the 100 individuals who had attracted the most votes.

A total of 20 scientists and inventors are included--more than in any other category. Babbage, Turing and Berners-Lee rub shoulders with the likes of Isaac Newton, Michael Faraday, James Watt and Stephen Hawking.

Others in the top 100 include Shakespeare, Winston Churchill, Horatio Nelson and Queen Elizabeth I.

Father of computing
Charles Babbage, who was born in 1791, is regarded as the father of computing because of his research into machines that could calculate. Babbage's Difference Engine No. 1 was the first device that could calculate and print mathematical tables.
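The Difference Engine exploited the method of finite differences, under which successive values of a polynomial can be produced by addition alone--no multiplication machinery needed. A minimal Python sketch of the idea (the quadratic and its seed values are an arbitrary example, not taken from Babbage's tables):

```python
# Tabulate a quadratic by finite differences, as the Difference Engine
# did mechanically: once the initial value and difference columns are
# seeded, every new table entry requires only additions.

def difference_table(p0, d1, d2, n):
    """Generate n values of a quadratic from its initial value p0,
    first difference d1, and constant second difference d2."""
    values = [p0]
    for _ in range(n - 1):
        p0 += d1   # next function value
        d1 += d2   # next first difference
        values.append(p0)
    return values

# Example: p(x) = 2x^2 + 3x + 1 at x = 0, 1, 2, ...
# p(0) = 1, first difference p(1) - p(0) = 5, second difference = 4.
print(difference_table(1, 5, 4, 5))  # [1, 6, 15, 28, 45]
```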

Babbage also spent years working on a more sophisticated device, the Analytical Engine. In addition to being able to calculate sums, the Analytical Engine could also read data from punch cards, giving it a memory and the ability to make decisions based on previous calculations.

But politicians of the day did not provide the financial backing that Babbage sought, and the Analytical Engine was never completed. Its importance to modern computing, though, is illustrated by the fact that the programming language Ada was named after Augusta Ada Lovelace (daughter of the English poet Lord Byron), who worked with Babbage on the Analytical Engine.

Founding computer science
Alan Turing was both an unlikely hero of the Second World War and a vital player in the birth of computing.

Born in 1912, Turing studied mathematics at Cambridge University, work that led to his famous paper, "On Computable Numbers," published in 1936. In it, Turing gave a precise definition of mechanical computation and showed that no procedure can decide the truth of every mathematical statement. He also described what is now known as the Universal Turing Machine: a single machine that, given a suitable description of any other computing machine, can carry out that machine's computation.

By realizing that a machine supplied with the appropriate program could be adapted to carry out a whole range of tasks, rather than being constructed to solve just one problem, Turing laid the foundations for modern computing.
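The idea that one adaptable machine plus a program can replace many special-purpose machines is easy to sketch in code. Below is a toy Turing-machine simulator in Python; the transition table shown (flip every bit, then halt) is an illustrative program, not one from Turing's paper:

```python
# A minimal Turing-machine simulator: one general-purpose loop runs
# any machine described by a transition table (the "program").
# The example program below simply inverts a binary string.

def run(program, tape, state="start"):
    tape = list(tape)
    pos = 0
    while state != "halt":
        symbol = tape[pos] if pos < len(tape) else "_"  # "_" = blank cell
        state, write, move = program[(state, symbol)]
        if pos < len(tape):
            tape[pos] = write
        else:
            tape.append(write)
        pos += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# Program: scan right, flipping 0 <-> 1; halt on the first blank.
invert = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run(invert, "1011"))  # 0100
```

Changing the behavior means changing only the table, not the machine--which is the essence of Turing's insight.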

During World War II, Turing--an anti-war protester in the 1930s--worked at the top-secret Bletchley Park, where teams of cryptanalysts tried to crack coded German military and intelligence communications.

The German military coded their messages using Enigma machines, which were thought to be totally unbreakable. But Turing designed an electro-mechanical machine called a bombe that speeded up the decryption process, which meant that the Bletchley Park teams were able to decode many of Germany's military communications.

Winston Churchill once described Bletchley Park as Britain's secret weapon that won the war.

Weaving the Web
Tim Berners-Lee, who invented the World Wide Web in 1989 while working for CERN--the European Particle Physics Laboratory in Geneva, Switzerland--has been credited as one of the most influential people of the 20th century.

Berners-Lee in 1980 wrote the Enquire program that let him link related documents stored on his computer. These hypertext links allowed him to organize his work by "remembering" the association between two documents.
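The two-way associations Enquire recorded can be sketched as a simple data structure: linking one document to another makes each discoverable from the other. The Python below is purely illustrative and does not reflect Enquire's actual implementation (the document names are invented):

```python
# Toy sketch of Enquire-style bidirectional links between documents.
# Adding a link in one direction automatically records the reverse
# association, so the relationship is "remembered" from both ends.

from collections import defaultdict

links = defaultdict(set)

def link(doc_a, doc_b):
    """Record a two-way association between two documents."""
    links[doc_a].add(doc_b)
    links[doc_b].add(doc_a)

# Hypothetical documents:
link("hypertext-notes", "cern-phonebook")
link("hypertext-notes", "experiment-log")

print(sorted(links["hypertext-notes"]))  # ['cern-phonebook', 'experiment-log']
print(sorted(links["cern-phonebook"]))   # ['hypertext-notes']
```

The Web later relaxed this model: an HTML hyperlink points one way only, which is part of what let it scale across machines that don't know about each other.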

The next stage was to link to documents stored on other computers. To make this possible, Berners-Lee started work in 1990 on creating the first World Wide Web server, the first Web browser, the URL addressing system, and the HTML language used to code Web pages. By the summer of 1991, Berners-Lee's browser, called "WorldWideWeb," was available on the Internet.

Berners-Lee's work was recognized last year when he was made a fellow of the Royal Society, the prestigious British scientific body.

ZDNet UK's Graeme Wearden reported from London.