Keynote Program
 
Keynote 1: Tuesday, September 09, 2003, 09:40-11:00
 
“Millipede” - A Nanotechnology-based Approach to Data Storage
Evangelos Eleftheriou
(PDF Presentation Slides)


IBM Research, Switzerland

Abstract

Data storage is one of the key elements in information technology. The ever-increasing demand for more storage capacity in an ever-shrinking form factor, together with the pressure to lower the price per gigabyte, has driven substantial worldwide research and development aimed at increasing storage densities by various means. For many decades, silicon-based semiconductor memory chips and magnetic hard-disk drives (HDDs) have dominated the data-storage market. So far, both technologies have improved their storage densities by about 100% per year while reducing the cost per gigabyte. However, the areal densities that today's magnetic recording technologies can achieve will eventually reach a limit imposed by the well-known superparamagnetic effect, conjectured today to be on the order of 250 Gbit/in² for longitudinal recording. Several proposals have been made to overcome this limit, for example the adoption of patterned magnetic media, where the biggest challenge remains patterning the magnetic disk in a cost-effective manner. In the case of semiconductor memories such as DRAM, SRAM, and Flash, the challenges lie predominantly in lithography: defining and fabricating sub-100-nm FET gates as well as very thin gate-oxide materials.


Today, ultrahigh storage densities of 1 Tbit/in² or more can be achieved by using local-probe techniques to write, read back, and erase data in very thin polymer films. The thermomechanical scanning-probe-based data-storage concept, internally dubbed "millipede", combines ultrahigh density, a small form factor, and high data rates. High data rates are achieved through the parallel operation of large 2D arrays with thousands of micro/nanomechanical cantilevers/tips that can be batch-fabricated by silicon surface-micromachining techniques. The inherent parallelism, the ultrahigh areal density, and the small form factor may open up new perspectives and opportunities for applications in areas beyond those envisaged today.
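The densities quoted in the abstract translate into striking capacities. A rough back-of-the-envelope calculation, in which the 1 cm² storage-field size is an illustrative assumption rather than a figure from the abstract:

```python
# Back-of-the-envelope capacity at probe-storage densities.
# Assumption (not from the abstract): a 1 cm^2 storage field.
IN2_PER_CM2 = 1 / 6.4516           # square inches per square centimetre
density_bits_per_in2 = 1e12        # ~1 Tbit/in^2, as cited in the abstract
area_cm2 = 1.0

bits = density_bits_per_in2 * IN2_PER_CM2 * area_cm2
gigabytes = bits / 8 / 1e9
print(f"{gigabytes:.1f} GB")       # roughly 19 GB on one square centimetre
```

At such densities even a fingernail-sized device would hold tens of gigabytes, which is what makes the small-form-factor claim significant.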


About the speaker

Evangelos Eleftheriou received a B.S. degree in Electrical Engineering from the University of Patras, Greece, in 1979, and M.Eng. and Ph.D. degrees in Electrical Engineering from Carleton University, Ottawa, Canada, in 1981 and 1985, respectively. He joined the IBM Zurich Research Laboratory in Rüschlikon, Switzerland, in 1986, where he has worked on various projects related to wired and wireless communications as well as magnetic recording. He currently manages the advanced storage technologies group at the IBM Zurich Research Laboratory.

His primary research interests lie in the areas of communications and information theory, particularly signal processing and coding for transmission and recording systems. Recently, his research activities have expanded into the area of nanotechnology, in particular probe-storage techniques. He holds over 30 patents (granted and pending applications) and was named a Master Inventor at IBM Research in 1999. He was a co-recipient of the 2003 IEEE Communications Society Leonard G. Abraham Prize Paper Award, and in January 2002 he was elected a Fellow of the IEEE.

Keynote 2: Wednesday, September 10, 2003, 09:00-10:30
 
Integrating Information for on Demand Computing
Nelson Mattos
(PDF Presentation Slides)


IBM, SVL Laboratory, CA, USA

Abstract

Information integration provides a competitive advantage to businesses and is fundamental to on demand computing. It is a strategic area of investment for software companies today, whose goal is to provide a unified view of data regardless of differences in data format, data location, and access interfaces; to dynamically manage data placement to match availability, currency, and performance requirements; and to provide autonomic features that reduce the burden on IT staff of managing complex data architectures. This paper describes the motivation for integrating information for on demand computing, explains its requirements, and illustrates its value through usage scenarios. As the paper shows, a tremendous amount of research, engineering, and development work is still needed to make the information integration vision a reality, and software companies are expected to continue investing heavily in aggressively pursuing it.
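The "unified view over heterogeneous sources" idea in the abstract is classically realized with a wrapper/mediator architecture. The toy sketch below illustrates that pattern only; all class and field names are invented for illustration and do not correspond to any IBM product API:

```python
# Toy mediator: each wrapper hides a source's format and access
# interface behind one common query(pred) method, and the mediator
# presents a single query interface over all of them.
# All names here are illustrative, not from any real product.

class CsvWrapper:
    """Wraps a tabular source already parsed into dict rows."""
    def __init__(self, rows):
        self.rows = rows
    def query(self, pred):
        return [r for r in self.rows if pred(r)]

class KeyValueWrapper:
    """Wraps a key-value store, reshaping entries into dict rows."""
    def __init__(self, store):
        self.store = store
    def query(self, pred):
        rows = [{"id": k, **v} for k, v in self.store.items()]
        return [r for r in rows if pred(r)]

class Mediator:
    """Single point of query over heterogeneous wrapped sources."""
    def __init__(self, *sources):
        self.sources = sources
    def query(self, pred):
        out = []
        for s in self.sources:          # fan the query out to every source
            out.extend(s.query(pred))
        return out

m = Mediator(
    CsvWrapper([{"id": "1", "city": "Zurich"}, {"id": "2", "city": "San Jose"}]),
    KeyValueWrapper({"3": {"city": "Zurich"}}),
)
print(m.query(lambda r: r["city"] == "Zurich"))  # matching rows from both sources
```

Real federated systems such as those the abstract describes add query optimization, capability negotiation with each source, and pushdown of predicates, which this sketch omits.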


About the speaker

Nelson Mattos, Ph.D., is one of an elite group of practicing expert technical executives, IBM Distinguished Engineers, recognized not only for their exceptional engineering and programming in research, development, sales, and service, but also for their anticipated significant future contributions to IBM's growth. In his current role as director of information integration at the IBM Silicon Valley Laboratory in San Jose, California, Dr. Mattos is responsible for establishing IBM's leadership position in the emerging information integration market. He collaborates with standards bodies and with IBM customers, Business Partners, and development teams to help businesses integrate digital information assets and leverage the value of those assets across the enterprise. Capitalizing on his strong research background, Dr. Mattos is responsible for the strategy, marketing, and development of such products as DB2 Information Integrator, DiscoveryLink, Replication, and Relational Connect.

Previously, as manager of advanced business intelligence and database technology, Dr. Mattos was the IBM development manager responsible for such DB2 Universal Database business intelligence solutions as DB2 OLAP Server, DB2 Warehouse Manager (formerly Visual Warehouse), QMF, Intelligent Miner, Data Replication, DataJoiner, and the West Coast development of DB2 UDB on Unix and Windows. He also managed several groups at the IBM Database Technology Institute. Additionally, Dr. Mattos served not only as chief architect for DB2's object-relational technologies, but also as leader of the development of various DB2 Extender products that exploit the object-relational features of DB2. Until recently, he was also responsible for IBM's participation in various standards forums, including the ANSI SQL committee, the International Organization for Standardization (ISO) committee for databases, W3C, OMG, and SQLJ. In this capacity, he contributed extensively to the design of SQL99 through more than 300 accepted proposals.

Prior to joining IBM, Dr. Mattos was an associate professor at the University of Kaiserslautern in Germany, where he was involved in research on object-oriented and knowledge base management systems and received a Ph.D. in computer science. He also holds bachelor of science and master of science degrees from the Federal University of Rio Grande do Sul in Brazil. Dr. Mattos is fluent in four languages, has published over 75 papers on database management and related topics in various magazines and conferences, and is the author of the book An Approach to Knowledge Base Management.

Keynote 3: Friday, September 12, 2003, 09:00-10:00
 
The Data-Centric Revolution in Networking
Scott Shenker
(PDF Presentation Slides)


ICSI, UC Berkeley, CA, USA

Abstract

Historically, there has been little overlap between the database and networking research communities; they operate on very different levels and focus on very different issues. While this strict separation of concerns has lasted for many years, in this talk I will argue that the gap has recently narrowed to the point where the two fields now have much to say to each other.

Networking research has traditionally focused on enabling communication between network hosts. This research program has produced a myriad of specific algorithms and protocols to solve such problems as error recovery, congestion control, routing, multicast and quality-of-service. It has also led to a set of general architectural principles, such as fate sharing and the end-to-end principle, that provide widely applicable guidelines for allocating functionality among network entities.

This research and design paradigm has been exclusively host-centric; hosts are assumed to know which other hosts (or multicast groups) to contact, and the research focuses on making the resulting host-host communication robust and efficient. However, an increasing number of applications involve accessing particular data objects whose location can't easily be determined within the current Internet architecture. Networking researchers have consequently begun looking at a variety of approaches that are more data-centric than host-centric, in that the basic abstractions refer to the name of the data rather than its location.

This data-centric trend is most visible in two areas of networking research: sensornets and distributed hash tables. Data-centrism is natural for sensornets because the identity of individual nodes is far less important than the data they collect. Traditional networking has toyed with data-centrism in various limited forms (e.g., web redirection, intentional naming), but the recent advent of distributed hash tables has led to a much broader and more explicit engagement with the data-centric paradigm. In both the sensornet and traditional Internet cases, data-centric research initially focused on how to efficiently access data based on logical names. More recent research has used distributed data structures to support more general queries.
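The core abstraction described above, addressing data by the hash of its name rather than by a host's location, can be sketched with consistent hashing, the mechanism underlying DHTs such as Chord. This is an illustrative toy, not any particular system, and the node names are hypothetical:

```python
# Minimal consistent-hashing ring: a data item is located by hashing
# its *name* onto a ring and finding the responsible node (successor),
# so any participant resolves the same name to the same node without
# knowing data locations in advance -- the data-centric abstraction.
import hashlib
from bisect import bisect_right

def h(name: str) -> int:
    """Map any name onto a 160-bit identifier ring (SHA-1, as in Chord)."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

class Ring:
    def __init__(self, nodes):
        # Each node occupies the point given by the hash of its own name.
        self.points = sorted((h(n), n) for n in nodes)

    def lookup(self, key: str) -> str:
        """Return the node responsible for `key`: its successor on the ring."""
        i = bisect_right(self.points, (h(key), chr(0x10FFFF)))
        return self.points[i % len(self.points)][1]  # wrap around past the top

ring = Ring(["node-a", "node-b", "node-c"])
print(ring.lookup("sensor-readings/2003-09-12"))  # same answer on every host
```

Production DHTs add the routing state needed to find the successor in O(log n) hops and replication for node churn; the lookup abstraction itself is as simple as above.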

Networking researchers have thus unwittingly wandered into the Coddian world of physical data independence, a territory far more familiar to database researchers. This talk will describe our journey to your land.


About the speaker

Scott Shenker received his degrees in theoretical physics from Brown University (Sc.B.) and the University of Chicago (Ph.D.). After a postdoctoral year at Cornell's physics department in 1983, he joined Xerox's Palo Alto Research Center. He left PARC in 1999 to head a newly established Internet research group at the International Computer Science Institute (ICSI) in Berkeley. Scott's research attention has wandered over the years, ranging from computer performance modeling and computer networks to game theory and economics. Most of his recent work has focused on the Internet architecture and related issues.
