Wednesday, October 15, 2008

Wall Street meltdown linked to 'outsourcing' of regulation to private code

- Source: IEEE Computing Newsletter, Oct 2008; By Patrick Thibodeau, October 8, 2008 (Computerworld)
The full depth of IT's involvement in Wall Street's meltdown is unknown, but one plan to stop it from happening again calls upon a growing IT trend: open source.
Erik Gerding, an assistant professor of law at the University of New Mexico who researches securities law and asset price bubbles, said that in agreeing to rely on computer models, the U.S. Securities and Exchange Commission essentially "outsourced" financial regulation to proprietary code developed by financial services firms.
The fix? Open source the underlying codes -- much as with open-source software -- to improve the code financial services firms use to calculate risk and to boost transparency for regulators.
Financial services risk models have thousands of variables and are as complicated as weather-system models. They can take enormous computing power to run, which is evident in the industry's spending: in 2003, financial services firms spent $169 million on high-performance systems; by 2007, according to research firm IDC, spending had reached $305 million.
As the capability and power of the technology increased, so did the complexity of the investments, such as mortgage securities. Regulators' confidence in these tools grew so much that the SEC made a fundamental change in how it regulated financial services firms. On April 28, 2004, the SEC -- at the insistence of financial services firms -- loosened its capital rules, agreeing to rely on the firms' own computer models to assess risk. The meeting, which received scant attention at the time, was recently the focus of a New York Times report.
In a draft paper made available last month, Gerding said regulators may have been comfortable with this increased level of risk because they saw lenders on the hook for the loss. "Lenders and the financial markets, many regulators assumed, accurately priced and managed this risk due to advances in the risk models -- a type of code -- they employed," Gerding wrote.
The SEC was also acting at a time when confidence in technology was high. If there was any reason to worry about the SEC's action, it wasn't evident in a 2005 speech by Alan Greenspan, then chairman of the Federal Reserve. He said technology and new credit-scoring models gave lenders a means "for efficiently extending credit to a broader spectrum of consumers."
So, who made up this "broader spectrum" of consumers? Subprime mortgage borrowers. "Where once more-marginal applicants would simply have been denied credit, lenders are now able to quite efficiently judge the risk posed by individual applicants and to price that risk appropriately," said Greenspan, who stepped down in 2006. But Greenspan's assertion that lenders were technologically enabled to judge the risk was wrong. RealtyTrac of Irvine, Calif., estimates that as many as 1 million people will lose their homes to foreclosure this year; in 2007, 400,000 people lost theirs. Late last year, Greenspan raised his estimate of the likelihood of a recession.
The financial impact of the bad loans rippled up through the financial system: once a mortgage originator sold a loan, it was sliced up into other investments, making any understanding of the risk progressively worse. The human elements, such as the number of so-called "liars' loans" (loans based on unverified income), apparently weren't part of the risk models. These problems are expected to increase investment in risk management.
Gerding said he believes the answer is to open source the financial codes, which would let banks and rating agencies see the actual code for the models used to set capital requirements and see how risk was assessed.
"Just as with open-source software, other users would then be able to copy and modify these models for their own use," said Gerding, who noted studies supporting the premise that open source is less prone to bugs.
Wall Street firms are already major users of open-source software. But Lisa Cash, executive vice president of sales and marketing at DFA Capital Management, a company that develops financial codes, said it will be very difficult to get high-quality products out in the market. "Who would actually spend the money to do it?" Cash asked.
Unlike U.S. regulators, European counterparts audited the risk models, Cash said, which increased transparency and confidence in them. The more transparency, "the better it is for our business," she said. European regulators look at the codes, but agree not to disclose them.
But others suggest that improving transparency will not be as easy as simply opening up code. Peter Teuten, president and CTO of Keane Business Risk Management Solutions, said there were neither common standards in risk management, nor anything like a "universal stress test" for more than a basic set of risk scenarios.
Teuten questioned using open source as a model because it may lead to multiple, noncentralized development efforts. He said he does, however, see standards emerging from the current crisis.

Monday, October 13, 2008

Technology that powers the Phoenix Mars Lander

- Source: EDN Asia Newsletter, Date: 01 Oct 2008


The Phoenix lander's 90-day mission in Mars' north polar region is to gather dirt and rock samples with its robotic arm, analyze the samples with onboard instruments, and communicate results to, and respond to commands from, its earthbound project engineers. All of these tasks require electrical power. The power-generation, -regulation, and -delivery system for the Phoenix comprises a lithium-ion-battery pack from Yardney Technical Products and two solar arrays from ATK Space Systems. The oncoming Martian winter constrains the mission to a tight, 90-day window. Temperatures then drop to the point at which the atmosphere, which is more than 95% carbon dioxide, freezes solid and shrouds the lander in dry ice. The Phoenix will then shut down for the winter -- and, most likely, forever.

Each array unfolds like an oriental fan into a circular shape 2.1m in diameter and can generate 770W of power from sunlight at the distance Earth is from the sun. Because Mars is approximately 1.5 times farther from the sun, the solar arrays will produce less than half the power possible on Earth.
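The "less than half" figure is just the inverse-square law applied to the 770W rating; a quick back-of-the-envelope check using only the numbers in the paragraph above:

# Solar flux falls off with the square of distance from the sun.
earth_rating_w = 770     # per-array output at Earth's distance from the sun
distance_ratio = 1.5     # Mars is roughly 1.5 times farther from the sun

mars_output_w = earth_rating_w / distance_ratio ** 2
print(f"Estimated per-array output at Mars: ~{mars_output_w:.0f} W")  # ~342 W, under half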
Solar arrays are the primary power source for the lander. The arrays’ gallium-arsenide crystalline-photovoltaic cells have an efficiency of 27%. Developers grew the cells on a germanium substrate and then bonded them to a flexible substrate structure. The cells have the maximum photovoltaic-conversion efficiency available given the spectral content of sunlight on the Martian surface.
The lithium-ion battery comprises two identical battery modules as well as the electronics to monitor cell voltages, control charging and discharging, and perform cell balancing. The two modules form a V shape measuring 13.38 × 10 × 9.5 in. and weighing 17.8 kg. Each module's maximum continuous-output current is 12A at 28.8V, with an ampere-hour capacity of 33 Ahr and an energy-storage capacity of 950.4 Whr.
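The per-module figures are internally consistent: the energy-storage capacity is simply the output voltage multiplied by the ampere-hour capacity. A quick check, using only the numbers quoted above:

# Per-module battery figures from the paragraph above.
voltage_v = 28.8        # output voltage
capacity_ah = 33        # ampere-hour capacity
max_current_a = 12      # maximum continuous output current

energy_wh = voltage_v * capacity_ah        # 950.4 Whr per module, as quoted
pack_energy_wh = 2 * energy_wh             # two identical modules
max_power_w = voltage_v * max_current_a    # continuous power per module

print(f"{energy_wh:.1f} Whr per module, {pack_energy_wh:.1f} Whr total, "
      f"{max_power_w:.1f} W continuous per module")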
The Phoenix batteries will provide power at night, when there is no sunlight for the solar panels to convert to electricity. The lander can also use the batteries whenever a task requires more power than the solar arrays can deliver. The battery, surrounded by insulation, fits inside a thermal enclosure on the component deck on the underside of the lander.
Just how cold does it get at the poles during a Martian winter, when the carbon dioxide atmosphere freezes solid? The atmospheric pressure fluctuates but, at its highest, is about one-hundredth of Earth's. At 1 atm (atmosphere) of pressure, carbon dioxide freezes at -78.5°C; at Mars' much lower atmospheric pressure, it freezes at about -125°C (-193°F).

Sunday, October 12, 2008

Mobile Industry Continues to Boom Despite Turmoil in World Financial Markets

- Source: Electronics Manufacturing (EM) Asia, Date: 3 October 2008
Despite turmoil in world financial markets over the last year, the trillion-USD mobile industry continues to confound expectations with spectacular, accelerating growth. A new report from Portio Research reveals that over half the world now uses a mobile phone and predicts that 80 percent of the world's population will be doing so by the end of 2013 - a staggering 5.8 billion people. The report, 'Worldwide Mobile Market Forecasts 2009 – 2013', provides a comprehensive analysis of worldwide mobile markets and growth forecasts, plus network operator and handset vendor market shares.

Among the top 20 growth markets for 2007-2013 there are few surprises. China takes the top spot, just ahead of India; these two countries are expected to contribute over 1 billion additional subscribers during this period. Brazil comes in a distant third with 132 million additional subscribers over the same period. Africa, the Middle East and Latin America are also expected to experience high growth, estimated at a CAGR of 13.3 percent, 10.7 percent and 9.9 percent, respectively. Meanwhile, despite rising worldwide mobile voice and data revenues, mobile ARPU continues to decline and is predicted to fall from USD 23.2 in 2005 to USD 15.8 by the end of 2013, largely because additional subscriber growth is likely to come from low per-capita-income markets.

'Worldwide Mobile Market Forecasts 2009 – 2013' also provides a comprehensive breakdown of mobile handset market share, with news that Nokia is still king, shipping over 437 million handsets during 2007, while Samsung has displaced Motorola from the number 2 spot. In the first two quarters of 2008, LG displaced Sony Ericsson from the number 4 spot.

A substantial piece of research spanning over 350 pages, 'Worldwide Mobile Market Forecasts 2009 – 2013' provides an overview of the world's major regions plus detailed analysis of 73 individual country markets. Each country section includes a mobile market overview with discussion of subscriber growth and penetration, and takes a look at the competitive landscape, highlighting any recent developments.

"In 2006 we predicted that over half the world would be using a mobile phone by 2009 - thought by many to be wishful thinking at the time," said John White, Business Development Manager at Portio Research. "Despite this, the mobile industry achieved the milestone early in 2008 and continues to be a beacon of good news amid the daily gloom and doom of the last year. This report suggests that this will continue," White remarked.
Link: http://emasiamag.com/article-4484-mobileindustrycontinuestoboomdespiteturmoilinworldfinancialmarkets-Asia.html
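The regional growth figures quoted in the report are compound annual growth rates (CAGR). As an illustration of the arithmetic only, the sketch below shows how a CAGR is computed from two endpoint values and what a 13.3 percent rate implies over the 2007-2013 period; the ARPU figures come from the report, while the 100-million subscriber base is a hypothetical starting point.

def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

def grow(start, rate, years):
    """Value after compounding `rate` annually for `years` years."""
    return start * (1 + rate) ** years

# ARPU is forecast to fall from USD 23.2 in 2005 to USD 15.8 in 2013.
print(f"ARPU CAGR 2005-2013: {cagr(23.2, 15.8, 8):.1%}")  # about -4.7% a year

# What the 13.3 percent CAGR estimated for Africa does to a hypothetical
# base of 100 million subscribers over the six years from 2007 to 2013:
print(f"Subscribers after 6 years: {grow(100e6, 0.133, 6) / 1e6:.0f} million")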

Friday, October 10, 2008

Airtel launches satellite TV to grip DTH arena

- Source: SiliconIndia News, Date: Friday, October 10, 2008
Bangalore: After Reliance's venture into the DTH arena, it is now Airtel, the telecom provider, that is stepping in, making the service available through 21,000 retail locations. The new entrant will flaunt digital TV with the MPEG4 standard and DVB-S2 technology. Commenting on the new venture, Manoj Kohli, CEO of Bharti Airtel, said, "Today we are starting a new chapter in our journey, one that adds a new dimension to our existing product portfolio and is a major step towards transforming Airtel from just a telecom brand to a lifestyle enabler. The launch of Airtel digital TV is the culmination of our three screens strategy, which is to be present across mobile phone, computer and TV screens."
Airtel relationship centers in 62 cities across India will deliver the service, which comes with a 20 percent larger dish antenna for better performance during rain. The telecom provider will also include a universal remote that works with both the set-top box and the TV, along with a low-battery indicator, a search interface and an on-screen account meter. The DTH service also brings applications like iMatinee (book cinema tickets), iTravel (browse and book travel packages), iShop (shop on TV for your favorite brands), iCity (get your city's information) and Widgets (update yourself on the latest stock news). Apart from Airtel, Videocon is also set to enter the market this month.

Google unveils $4.4 trillion 'Clean Power by 2030' plan

- Source: SiliconIndia Newsletter, Date: Friday, October 03, 2008
Washington: Search engine giant Google has unveiled a $4.4 trillion plan dubbed Clean Power by 2030 that calls for all energy in the US to come from renewable sources. The web giant in a release posted on its site said: "While this plan will cost $4.4 trillion (in undiscounted 2008 dollars), it will ultimately save $5.4 trillion, delivering a net savings of $1 trillion over the life of the plan".
The three basic elements of the clean energy plan are new transmission lines and policies such as a national renewable portfolio standard, new-generation vehicles running on non-oil fuels, and greater energy efficiency through smart meters and real-time pricing. Under the plan, wind power is envisioned to generate 380 gigawatts (GW) and solar power would provide 250 GW. Geothermal energy would produce 80 GW and is expected to take a greater role as the technology matures. A gigawatt is equal to one billion watts, a unit typically used for large power plants or power grids. Google also calls for more than 32,000 km of new transmission lines to support renewable energy generation.
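To give a rough sense of what those capacity figures mean in delivered energy, the sketch below multiplies each one by an assumed capacity factor and totals the result; the capacity factors and the comparison figure of roughly 4,000 TWh of annual US electricity generation in 2008 are illustrative assumptions and background facts, not numbers from the article.

# Capacity figures from the plan (gigawatts) paired with assumed capacity factors.
# The capacity factors are illustrative assumptions, not from the article.
plants = {
    "wind":       (380, 0.35),
    "solar":      (250, 0.20),
    "geothermal": (80,  0.90),
}

hours_per_year = 8760
total_twh = 0.0
for name, (gw, capacity_factor) in plants.items():
    twh = gw * capacity_factor * hours_per_year / 1000   # GW x hours -> TWh
    total_twh += twh
    print(f"{name:>10}: {twh:,.0f} TWh/year")

# For scale: US electricity generation in 2008 was roughly 4,000 TWh.
print(f"Total: ~{total_twh:,.0f} TWh/year of renewable generation")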
The web giant envisions 22 million plug-in vehicles by 2030, which would make up half of the total estimated vehicles on American roads.
Vehicles using traditional technology would need to improve fuel efficiency to 72 km per gallon (roughly 45 miles per gallon) by 2030. "We should offer incentives to get older, inefficient vehicles off the roads," the plan says.

"When homes are equipped with smart meters and real-time pricing, research shows that energy use typically drops. Google is looking at ways that we can use our information technology and our reach to help increase awareness and bring better, real-time information to consumers," the website said.

"Energy efficiency is the area where Google has been the least vocal, but could potentially offer the most support, by providing a lot of important communications data," the release said.

Green economy can create nine lakh jobs in India

- Source: Silicon India Newsletter, Date: Monday, October 06, 2008
New Delhi: The global market for environmental products and services is projected to double from $1.37 trillion per year at present to $2.74 trillion by 2020, creating millions of new "green jobs", says a recent report commissioned by a number of UN organisations. India alone can generate 900,000 jobs by 2025 in the area of biogas: 300,000 in the manufacturing of stoves and 600,000 in areas such as processing into briquettes and pellets and the fuel supply chain, says the report.
Called 'Green Jobs: Towards Decent Work in a Sustainable, Low-Carbon World', the report was commissioned by the UN Environment Programme (UNEP) under a joint Green Jobs Initiative with the International Labour Office (ILO), the International Trade Union Confederation (ITUC) and the International Organization of Employers (IOE). The Worldwatch Institute produced it with technical assistance from the Cornell University Global Labour Institute.

However, the report also finds that the process of climate change, already underway, will continue to have negative effects on workers and their families, especially those whose livelihoods depend on agriculture and tourism.

The report says too few green jobs are being created for the most vulnerable: the 1.3 billion working poor (43 percent of the global workforce) whose earnings are too low to lift them and their dependants above the poverty threshold of $2 per person per day, or the estimated 500 million youth who will be seeking work over the next 10 years.

The authors of the report say that climate change, adaptation to it and efforts to arrest it by reducing emissions have far-reaching implications for economic and social development, for production and consumption patterns, and thus for employment, incomes and poverty reduction.
Other key findings of the report include:
* The global market for environmental products and services is projected to double from $1.37 trillion per year at present to $2.74 trillion by 2020.
* Half of this market is in energy efficiency and the balance in sustainable transport, water supply, sanitation and waste management. In Germany, for example, environmental technology is to grow fourfold to 16 percent of industrial output by 2030, with employment in this sector surpassing that in the country's big machine tool and automotive industries.
* Sectors that will be particularly important in terms of their environmental, economic and employment impact are energy supply, in particular renewable energy, buildings and construction, transportation, basic industries, agriculture and forestry.
* Clean technologies are already the third largest sector for venture capital in the US, after information technology and biotechnology, while green venture capital in China more than doubled to 19 percent of total investment in recent years.
* 2.3 million people have in recent years found new jobs in the renewable energy sector alone, and the potential for job growth in the sector is huge. Employment in alternative energies may rise to 2.1 million in wind and 6.3 million in solar power by 2030.
* Renewable energy generates more jobs than fossil fuels do. Projected investments of $630 billion by 2030 would translate into at least 20 million additional jobs in the renewable energy sector.
* In agriculture, 12 million could be employed in biomass for energy and related industries. In a country like Venezuela, an ethanol blend of 10 percent in fuels might provide one million jobs in the sugarcane sector by 2012.
* A worldwide transition to energy-efficient buildings would create millions of jobs, as well as "greening" existing employment for many of the estimated 111 million people already working in the construction sector.

World's biggest computing grid set to process data from Large Hadron Collider

- Source: SiliconIndia Newsletter, Date: Monday, October 06, 2008
Washington: The world's largest computing grid is all set to tackle the biggest-ever data challenge from the most powerful accelerator, the Large Hadron Collider (LHC). Three weeks after the first particle beams were injected into the LHC, the Worldwide LHC Computing Grid (WLCG) combines the power of more than 140 computer centres in 33 countries to analyse and manage more than 15 million gigabytes of LHC data every year.
A gigabyte is 1,024 megabytes -- roughly a billion bytes -- of information, so the grid must handle on the order of 15 petabytes of data a year. The US is a vital partner in the development and operation of the WLCG: fifteen universities and three US Department of Energy (DOE) national laboratories in 11 states are contributing their computing power to the project.

"The US has been an essential partner in the development of the vast distributed computing system that will allow 7,000 scientists around the world to analyse LHC data, complementing its crucial contributions to the construction of the LHC," said Glen Crawford of the High Energy Physics programme in DOE's Office of Science.

DOE and the National Science Foundation (NSF) support contributions to the LHC and to the computing and networking infrastructures that are an integral part of the project, according to a joint press release issued by the US Department of Energy's Brookhaven National Lab and the Fermi National Accelerator Centre.
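To put 15 million gigabytes a year in perspective, the arithmetic below converts that figure into a daily volume and the average sustained data rate it implies (treating a gigabyte as one billion bytes):

# Annual data volume quoted above: more than 15 million gigabytes per year.
annual_gb = 15e6

gb_per_day = annual_gb / 365                        # about 41,000 GB (41 TB) a day
avg_mb_per_s = annual_gb * 1000 / (365 * 86_400)    # average rate over a full year

print(f"~{gb_per_day:,.0f} GB/day, roughly {avg_mb_per_s:.0f} MB/s sustained")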
US contributions to the Worldwide LHC Computing Grid are coordinated through the Open Science Grid (OSG), a national computing infrastructure for science. The OSG contributes computing power not only for LHC data needs, but also for projects in many other scientific fields including biology, nanotechnology, medicine and climate science.
"Particle physics projects such as the LHC have been a driving force for the development of worldwide computing grids," said Ed Seidel, director of the NSF Office of Cyberinfrastructure. "The benefits from these grids are now being reaped in areas as diverse as mathematical modelling and drug discovery.""Open Science Grid members have put an incredible amount of time and effort in developing a nationwide computing system that is already at work supporting America's 1,200 LHC physicists and their colleagues from other sciences," said OSG executive director Ruth Pordes from DOE's Fermi National Accelerator Lab.Dedicated optical fibre networks distribute LHC data from CERN in Geneva, Switzerland to 11 major 'Tier-1' computer centres in Europe, North America and Asia, including those at DOE's Brookhaven National Lab in New York and Fermi National Accelerator Laboratory in Illinois. From these, data is dispatched to more than 140 "Tier-2" centres around the world, including 12 in the US."Our ability to manage data at this scale is the product of several years of intense testing," said Ian Bird, leader of the Worldwide LHC Computing Grid project.
"Today's result demonstrates the excellent and successful collaboration we have enjoyed with countries all over the world. Without these international partnerships, such an achievement would be impossible," he said.
"When the LHC starts running at full speed, it will produce enough data to fill about six CDs per second," said Michael Ernst, director of Brookhaven National Laboratory's Tier-1 Computing Centre."As the first point of contact for LHC data in the US, the computing centres at Brookhaven and Fermilab are responsible for storing and distributing a great amount of this data for use by scientists around the country. We've spent years ramping up to this point, and now, we're excited to help uncover some of the numerous secrets nature is still hiding from us," informed Ernst.Physicists in the US and around the world will sift through the LHC data torrent in search of tiny signals that will lead to discoveries about the nature of the physical universe. Through their distributed computing infrastructures, these physicists also help other scientific researchers increase their use of computing and storage for broader discovery."Grid computing allows university research groups at home and abroad to fully participate in the LHC project while fostering positive collaboration across different scientific departments on many campuses," said Ken Bloom from the University of Nebraska-Lincoln, manager for seven Tier-2 sites in the US.