Abstracts 2006


Atmel

Atmel is a large semiconductor manufacturer with a worldwide distribution of products. The company is currently undertaking a company-wide effort to increase efficiency. As part of this effort, it has asked this team to write a user-friendly replacement for its photolithography program. This new tool will be a web-browser-based application and will include functionality not available in the current program. The core features will include the following:
Time permitting, the team will add functionality for the following:
1) Graphing capability using the historical data logged in the database.
2) Database support for the current photolithography program. This will allow both the web-based program and the older Telnet program to be used simultaneously.

Avaya

Avaya has data relating to Root Cause Analysis (RCA) for various projects. The company lacks an efficient way to update and monitor this information when an error is encountered. We will implement a web-based solution to this problem.

This application will make use of Java Servlets and MySQL. The interface will be a website that accesses a database. The database will consist of three tables: Problem Record, Symptom Data, and Root Data. Each of these tables will hold RCA information pertaining to all of the projects.

The website will allow for three types of users: administrators, RCA team members, and developers. Each type of user will have separate privileges. Administrators will be able to view and modify all of the data, as well as manage privileges for the databases and users. RCA team members will be able to view and modify the symptom data and root data. Developers will have limited modification privileges on the symptom data and view-only access to the root data.
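
As a rough illustration of this privilege model, the sketch below shows one way a servlet layer might gate write access to the three tables by role before issuing a MySQL query through JDBC. The class, table, and column names (RcaAccess, symptom_data, project_id, and so on) are assumptions for illustration only, not the project's actual schema.

    import java.sql.*;
    import java.util.*;

    // Hypothetical sketch of role-based access to the RCA tables.
    public class RcaAccess {
        enum Role { ADMIN, RCA_TEAM, DEVELOPER }

        // Tables each role may modify (assumed table names).
        private static final Map<Role, Set<String>> WRITABLE = Map.of(
            Role.ADMIN,     Set.of("problem_record", "symptom_data", "root_data"),
            Role.RCA_TEAM,  Set.of("symptom_data", "root_data"),
            Role.DEVELOPER, Set.of("symptom_data"));

        static boolean canModify(Role role, String table) {
            return WRITABLE.getOrDefault(role, Set.of()).contains(table);
        }

        // Example read: fetch symptom rows for a project via JDBC/MySQL.
        static List<String> symptomsForProject(Connection conn, String projectId)
                throws SQLException {
            String sql = "SELECT description FROM symptom_data WHERE project_id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, projectId);
                List<String> rows = new ArrayList<>();
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) rows.add(rs.getString(1));
                }
                return rows;
            }
        }
    }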

BLM

The Bureau of Land Management (BLM) uses the Land and Realty Authorization Module (LRAM) as an internal database program. The LRAM is an intranet module that provides access to the database in which authorization and billing information for Right-of-Way (ROW) land rentals is stored.

The user provides appropriate search criteria and the corresponding report is generated. The user can modify, print, and save the report. The database query is performed using Brio Query or Hyperion Designer over an ODBC connection to the Data Warehouse. If time permits, this data will be made viewable on the intranet using Macromedia (Adobe) Dreamweaver.

Cherry Creek Physical Science

Ethan Dusto is a teacher at Cherry Creek High School who has tasked our team with designing and creating an interactive web page for his Physical Science class. The text for the freshman-level course is recommended for students in grades 12+, so Mr. Dusto would like a web site that provides the students with material to supplement his class lectures. The main focus of the web site will be interactive units built with Java applets, so that students can gain a firm understanding of the more complex material. There will also be interactive quizzes and tests created using HTML and JavaScript, so that students can test their knowledge of the course material prior to upcoming quizzes and exams in the classroom. Time allowing, the web site will also contain links to chapter summaries, keywords, and a brief question-and-answer section for each topic discussed. Lastly, Mr. Dusto will be given a template for the web site so that he may update or change it as needed, or even create a new web site for other classes that he teaches.

CSM 1 – Technology Camp

CSM 2 – Quantum Tunneling

We use a multi-band Bose-Hubbard model to study the quantum dynamics of ultracold bosons in a tilted two-well potential. We describe the energy eigenstates in detail. These consist of oscillator-like solutions and entangled states, including extreme cat states. The ground state ranges from a single number state to a coherent, or superfluid, state. We also describe the dynamics of a Bose-Einstein condensate (BEC) initially localized in one well.

We show how the oscillation frequency depends on the hopping strength, interaction potential, tilt, and the number of particles in the system. In the presence of high barriers the oscillation becomes exponentially slower as the number of particles increases. We find modulated oscillation in the presence of low barriers. Furthermore, we characterize the destruction and revival of entangled stationary states and, consequently, tunneling in the presence of high barriers.
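
For readers unfamiliar with the model, the standard single-band, two-site form of the tilted Bose-Hubbard Hamiltonian is sketched below (in LaTeX), where J is the hopping strength, U the on-site interaction, and epsilon the tilt. The multi-band model used in this work extends this form, so the equation is background only, not the exact Hamiltonian studied here.

    \hat{H} = -J\left(\hat{a}_1^{\dagger}\hat{a}_2 + \hat{a}_2^{\dagger}\hat{a}_1\right)
            + \frac{U}{2}\sum_{i=1,2}\hat{n}_i\left(\hat{n}_i - 1\right)
            + \frac{\epsilon}{2}\left(\hat{n}_2 - \hat{n}_1\right)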

We are currently expanding the model to allow particles to occupy excited states in order to study the macroscopic quantum tunneling of node-like structures, or dark solitons, through a potential barrier. Eventually we will include a third degenerate band to study the behavior of vortex-like structures in a tilted two-well potential.

CSM 3 - Prime Research

Dr. Jason Liu and Alex Probst from the PRIME research group have presented a project to create a Graphical User Interface (GUI) for the SSFNet simulator. The simulator creates a virtual network environment and allows real distributed applications to interact with it. Requirements for the project include the following: the ability to read and write the Data Modeling Language, a graphical representation of the network, and the ability to modify attributes in the network layout. Well-written documentation, portability, and maintainable, expandable code will lead to project success. Our goal is to complete and deliver the tools and documentation to the client within the given time frame.

CSM 4 – Geophysics

The Geophysics department at the Colorado School of Mines is developing a Java toolkit that can be used in programming scientific visual applications. Specifically, it provides graphics tools that allow display and interactive manipulation of 3-D surfaces. This is important to scientific applications, particularly in the field of geophysics, because the ability to interact with surfaces and manipulate joints and faults can greatly increase understanding of a material, such as the Earth's subsurface. Current applications do not provide good interactive models, and for this reason Mines is developing tools that will assist in development of better applications.

Our task is to design and implement methods for creating and displaying 3-D triangulated surfaces. We will be using graphics tools from the OpenGL library that have been wrapped in Java, along with advanced data structures such as scene graphs. Several of the problems we will need to solve involve selecting optimal divisions in a bounding-sphere hierarchy, as well as finding ways to interact with 3-D surfaces using a 2-D mouse.
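
As a rough sketch of the bounding-sphere hierarchy idea, each node stores a sphere enclosing a subset of triangles plus two children, so a query such as picking can skip whole subtrees whose spheres a pick ray misses. The class below is our own illustration; the names and structure are assumptions, not the Geophysics toolkit's API.

    // Minimal bounding-sphere hierarchy sketch (illustrative only).
    public class SphereNode {
        final double cx, cy, cz, radius;   // bounding sphere of this subtree
        final int[] triangles;             // triangle indices (leaf nodes only)
        final SphereNode left, right;      // children (null for leaves)

        SphereNode(double cx, double cy, double cz, double radius,
                   int[] triangles, SphereNode left, SphereNode right) {
            this.cx = cx; this.cy = cy; this.cz = cz; this.radius = radius;
            this.triangles = triangles; this.left = left; this.right = right;
        }

        // Collect triangles whose bounding spheres a pick ray might hit.
        void pick(double ox, double oy, double oz,      // ray origin
                  double dx, double dy, double dz,      // unit ray direction
                  java.util.List<Integer> hits) {
            // Distance from the sphere center to the ray's supporting line;
            // if it exceeds the radius, the whole subtree can be skipped.
            double px = cx - ox, py = cy - oy, pz = cz - oz;
            double t = px * dx + py * dy + pz * dz;
            double qx = px - t * dx, qy = py - t * dy, qz = pz - t * dz;
            if (qx * qx + qy * qy + qz * qz > radius * radius) return;
            if (left == null && right == null) {
                for (int tri : triangles) hits.add(tri);
                return;
            }
            if (left != null) left.pick(ox, oy, oz, dx, dy, dz, hits);
            if (right != null) right.pick(ox, oy, oz, dx, dy, dz, hits);
        }
    }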

CSM 5 – Chemistry #1

A protein is a complex, high-molecular-weight organic compound consisting of amino acids joined by peptide bonds. Protein sequencing refers to the task of ordering the amino acids in a protein of interest. The amino acid sequence of an unknown protein has traditionally been determined through a complex process of enzymatic digestion and mass spectrometric analysis. As this is a very time-consuming process, Voorhees et al. are attempting to modify the process by replacing the enzymatic digestion with a pyrolytic breakdown procedure.

The doctoral thesis of Meetani [1] demonstrated that proteins can be thermally degraded by pyrolysis in a selective and reproducible process. With a known protein structure, he was able to reconstruct the sequence of the original protein based on the mass spectrometry data of the thermal fragments. Unlike enzymatic digestion, where a protein is completely cleaved at well-known and predictable sites, the pyrolysis procedure yields a much more complex fragment solution. The corresponding spectrum contains information on a wide range of protein fragments in both cyclic and straight-chain conformations.

The client, Dr. Kent Voorhees, has requested the design of a method to reconstruct an original, unknown protein from the spectra of the original and pyrolyzed solutions. The spectra to be used in the development of this project were obtained via MALDI-TOF-MS (Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry) in the course of the doctoral research conducted by Meetani [1]. The following protein spectra have been supplied by the client in both digital and hardcopy format: melittin, insulin, insulin chain B, insulin chain B (fragment 22-30), and methionine enkephalin-Arg-Phe.
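
One plausible building block for such a method, offered here only as our own sketch rather than the client's specified approach, is to compute the expected mass of a candidate fragment in both straight-chain and cyclic form and check whether the measured spectrum contains a peak within some tolerance. The residue masses below are standard monoisotopic values for a few amino acids; the peak list and tolerance are illustrative.

    import java.util.*;

    // Illustrative sketch: does a candidate fragment's mass appear in the spectrum?
    public class FragmentMatcher {
        // Monoisotopic residue masses for a few amino acids (illustrative subset).
        private static final Map<Character, Double> RESIDUE = Map.of(
            'G', 57.02146, 'A', 71.03711, 'S', 87.03203,
            'V', 99.06841, 'L', 113.08406);
        private static final double WATER = 18.01056;

        // A straight-chain peptide carries an extra water relative to the residue
        // sum; a cyclic fragment does not.
        static double mass(String fragment, boolean cyclic) {
            double m = cyclic ? 0.0 : WATER;
            for (char c : fragment.toCharArray()) m += RESIDUE.get(c);
            return m;
        }

        static boolean matchesPeak(double mass, double[] peaks, double tolerance) {
            for (double p : peaks) if (Math.abs(p - mass) <= tolerance) return true;
            return false;
        }

        public static void main(String[] args) {
            double[] peaks = {358.22, 476.30};          // hypothetical peak list
            double m = mass("GAVL", false);             // straight-chain fragment G-A-V-L
            System.out.println(m + " matched: " + matchesPeak(m, peaks, 0.5));
        }
    }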

1. Meetani, Mohammed A. 2003. Bacterial Proteins Analysis Using Mass Spectrometry. Golden, Colorado: Colorado School of Mines. (Applied Chemistry Thesis) pp. 79-119.

CSM 6 – Chemistry #2

Chemistry Solver ’06 is an application that will allow the user to input an unbalanced chemical equation and will output the correctly balanced equation, if one or more balanced forms exist. Our group has been asked to convert a legacy program that was originally designed to perform this task but is not fully functional. The new application will be web-based instead of a standalone executable. In order to balance chemical equations, the application will need to count the charges of reactants and products and deal with them appropriately. It must also have a data store it can use to evaluate the different chemical elements.

This task will be completed using Java Server Pages (JSP) and Servlets, a data store that will contain the chemical elements and their properties, and an intuitive user interface. The goal of this project is to create a functional web application that will allow chemistry enthusiasts from around the world to quickly balance complex chemical equations.
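
One small piece of this work that can be sketched now is counting the atoms of each element in a formula such as Fe2(SO4)3; balancing then amounts to finding coefficients that equalize these counts (and any charges) on both sides of the equation. The parser below is a minimal assumed approach, not the legacy program's logic.

    import java.util.*;

    // Minimal sketch: count atoms per element in a formula such as "Fe2(SO4)3".
    public class FormulaParser {
        public static Map<String, Integer> count(String formula) {
            return parse(formula, new int[]{0});
        }

        private static Map<String, Integer> parse(String s, int[] pos) {
            Map<String, Integer> counts = new HashMap<>();
            while (pos[0] < s.length() && s.charAt(pos[0]) != ')') {
                Map<String, Integer> group;
                if (s.charAt(pos[0]) == '(') {
                    pos[0]++;                       // skip '('
                    group = parse(s, pos);
                    pos[0]++;                       // skip ')'
                } else {
                    // Element symbol: capital letter plus optional lowercase letters.
                    int start = pos[0]++;
                    while (pos[0] < s.length() && Character.isLowerCase(s.charAt(pos[0]))) pos[0]++;
                    group = new HashMap<>();
                    group.put(s.substring(start, pos[0]), 1);
                }
                int multiplier = readNumber(s, pos);
                for (Map.Entry<String, Integer> e : group.entrySet())
                    counts.merge(e.getKey(), e.getValue() * multiplier, Integer::sum);
            }
            return counts;
        }

        private static int readNumber(String s, int[] pos) {
            int n = 0;
            while (pos[0] < s.length() && Character.isDigit(s.charAt(pos[0])))
                n = n * 10 + (s.charAt(pos[0]++) - '0');
            return n == 0 ? 1 : n;
        }

        public static void main(String[] args) {
            System.out.println(count("Fe2(SO4)3"));   // prints {Fe=2, S=3, O=12}
        }
    }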

CSM 7 – Analysis of Aerosol Surface Data

The variability in aerosol measurements complicates the detection of climate variability and climate trends around the world. Aerosol measurements have historically varied by a factor of two or greater. Aerosols are particulate matter suspended in the atmosphere that is not supposed to be there; they are defined to be less than 10 micrometers in diameter. Examples include CFCs, carbon oxides, and nitrogen oxides. The three general ways of obtaining aerosol concentrations are direct air samples taken from a plane, remote sensing samples from a satellite, and samples from the ground (both direct and remote sensing). We will be concentrating on the measurements taken from the ground. We can retrieve the data for both locations, Bondville, IL and the Southern Great Plains, OK, via the Internet. We will have to distinguish the variability due to measurement techniques, the variability in time, and the variability in space.

We will have to generate a model that will identify and quantify the uncertainties. These uncertainties can be seen in daily patterns, seasonal patterns, spatial patterns, and weather patterns. In this project our goals are as follows:
1) Characterize the ground-level data.
2) Describe the ground-level variability in the data.
3) Statistically analyze the models.

These goals apply to sites with a long history of ground-based aerosol measurements. For this project we will also be learning new statistical tests, both in theory and in practice, using R. We have also decided to learn LaTeX, a typesetting system designed for producing mathematical reports. Our research project is joint work with Dr. Reinhard Furrer.

CSM 8 – Parametric Model for Oil Production

The world's dependence on fossil fuels has led to much debate regarding the remaining extractable crude oil. Many argue that the annual production of oil has declined in recent years and will continue to decline. A well-known method of approximating the growth, peak, and decline of finite, nonrenewable resources such as oil is the Hubbert model, created in 1956. Alternative methods based on non-linear statistical models have been suggested since.

The team will attempt to approximate the time, quantity, and nature of decline of the world's peak crude oil production given specific data sets. In order to determine this information, we will attempt to model the data with analytical, non-linear curves. We will develop parametric families of functions that describe our curves and eventually simulate oil production for a theoretical world. To accomplish these tasks, we will be using R for modeling and LaTeX for reporting.
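
For context, the classic Hubbert model takes cumulative production Q(t) to follow a logistic curve, so that the production rate P(t) is a symmetric bell-shaped curve peaking at time t_m. This is the standard textbook form (in LaTeX below), stated only as background; the parametric families the team develops need not take exactly this form.

    Q(t) = \frac{Q_\infty}{1 + e^{-k\,(t - t_m)}},
    \qquad
    P(t) = \frac{dQ}{dt} = k\,Q(t)\left(1 - \frac{Q(t)}{Q_\infty}\right)

Here Q_infinity is the ultimately recoverable quantity, t_m the peak time, and k the steepness of the rise and decline.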

Equizitive

eQUIZitive's primary product, medQ, is a web-based application used by hospitals and medical staffing companies to keep track of employees' documents and certifications. Currently, a majority of these documents are transmitted via fax. While this method is fast and reliable, document quality can be poor and at times the documents are unreadable. Our project is to design a Windows program through which users can scan documents and transmit them to the server via the Internet with as much ease of use as the current fax system. This program must be easy to use, with a simple yet robust user interface that allows additional information about each document to be entered prior to sending the documents to the server.

Medtronic

Medtronic Navigation is a company that specializes in adapting advanced localization techniques used by doctors for more precise, less invasive surgeries. To that end, the company has requested that our team develop software for use in testing an infrared-camera-based localization application.

Initially, our software will record actual data sent from a camera to their application and play this data back in order to simulate normal operation. The software can also be used to stress the application, both by introducing faulty data and through performance testing. The final aspect is to synthesize normal and fault conditions of the camera in order to streamline testing of the application over a wider spectrum of behaviors.
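
A rough sketch of the record-and-playback idea is shown below; the interfaces and names (Frame, FrameSink, FaultInjector) are our own assumptions, not Medtronic's API. Recorded frames are replayed with their original relative timing, and an optional fault injector can corrupt or drop frames to exercise error handling.

    import java.util.List;

    // Illustrative playback sketch: replay recorded camera frames with timing,
    // optionally injecting faults to stress the application under test.
    public class CameraPlayback {
        // A recorded frame: timestamp in milliseconds plus the raw payload.
        public record Frame(long timestampMs, byte[] payload) {}

        public interface FrameSink { void send(byte[] payload); }          // application input
        public interface FaultInjector { byte[] apply(byte[] payload); }   // may corrupt or drop

        public static void replay(List<Frame> frames, FrameSink sink,
                                  FaultInjector faults) throws InterruptedException {
            long start = System.currentTimeMillis();
            long firstTs = frames.isEmpty() ? 0 : frames.get(0).timestampMs();
            for (Frame f : frames) {
                // Wait until this frame's original offset from the first frame.
                long due = start + (f.timestampMs() - firstTs);
                long wait = due - System.currentTimeMillis();
                if (wait > 0) Thread.sleep(wait);
                byte[] out = (faults == null) ? f.payload() : faults.apply(f.payload());
                if (out != null) sink.send(out);   // null from the injector drops the frame
            }
        }
    }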

Toilers # 1

The study and implementation of wireless sensor networks is an emerging field in computer science, with applications ranging from environmental monitoring to national defense. These networks consist of individual sensor nodes that typically collect data and transmit it to a central server. Due to the distributed nature of these networks, energy consumption and latency are the primary determinants of network efficiency. Algorithms that optimize these two factors are an area of intensive research and the subject of this project.

The goal of this project is to evaluate one such algorithm using the open-source network simulator ns2. In LEACH, or Low-Energy Adaptive Clustering Hierarchy, a percentage of the nodes are randomly chosen to act as cluster heads. Once these heads have been determined, they broadcast a signal notifying other nodes within range, and those nodes then calculate their optimal cluster head. The heads act as a middle layer between individual nodes and the server, relaying data and optimizing packet size. By definition, the LEACH model ensures that only two hops separate a node from the server. A network consisting of 100 randomly deployed nodes implementing the LEACH model will be used to test programs for node management.
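
Cluster-head election in LEACH is commonly done with a stochastic threshold: each round a node draws a random number and elects itself head if the draw falls below T(n) = P / (1 - P (r mod 1/P)), where P is the desired fraction of heads and nodes that served as a head recently are excluded. The Java sketch below illustrates only this election step; it is our own illustration, not the ns2 implementation the team will evaluate.

    import java.util.Random;

    // Sketch of LEACH's per-round cluster-head self-election (illustrative only).
    public class LeachElection {
        private final double p;                 // desired fraction of cluster heads, e.g. 0.05
        private final Random rng = new Random();
        private Integer lastRoundAsHead = null; // round in which this node last served as head

        public LeachElection(double p) { this.p = p; }

        // Threshold T(n): zero if this node served as head within the last 1/P rounds,
        // otherwise P / (1 - P * (r mod 1/P)), so every node serves once per epoch.
        private double threshold(int round) {
            int epoch = (int) Math.round(1.0 / p);
            if (lastRoundAsHead != null && round - lastRoundAsHead < epoch) return 0.0;
            return p / (1.0 - p * (round % epoch));
        }

        // Each round the node draws a random number; below the threshold, it becomes a head.
        public boolean electSelf(int round) {
            boolean head = rng.nextDouble() < threshold(round);
            if (head) lastRoundAsHead = round;
            return head;
        }
    }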

The implementation of this algorithm using ns2 will enable the study of the performance of three different programs for node management: Active-Listening, Active-Sleeping, and Active-Listening-Sleeping. In Active-Listening (AL), the node is either transmitting or listening for an update request from the server. AL results in relatively low latency but high energy consumption. In Active-Sleeping (AS), the node is active only for transmissions triggered by its sensors and toggles to a sleep state upon completion of the transmission. In its sleep state, the node is unable to respond to queries from the server. If an arbitrary time interval T passes without communication with the server, the node is programmed to initiate a source-initiated update regardless of its sensor state. During this update, the node is active and the server can in turn initiate consumer updates as needed. Active-Listening-Sleeping (ALS) is a hybrid of AS and AL: after a source-initiated update, the node toggles to a listening state for an arbitrary time interval. At the end of this interval, the node toggles to a sleeping state, where it behaves as an AS-programmed node.
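
One compact way to picture these three programs is as a small state machine in which the chosen policy determines what state a node enters after a transmission or after a listening timeout. The enum sketch below is only our illustration; the actual study will be carried out in ns2.

    // Illustrative state machine for the three node-management programs.
    public class NodePolicy {
        enum Policy { ACTIVE_LISTENING, ACTIVE_SLEEPING, ACTIVE_LISTENING_SLEEPING }
        enum State { TRANSMITTING, LISTENING, SLEEPING }

        // State to enter once a transmission (source-initiated update) completes.
        static State afterTransmission(Policy policy) {
            switch (policy) {
                case ACTIVE_LISTENING:          return State.LISTENING;  // stay reachable
                case ACTIVE_SLEEPING:           return State.SLEEPING;   // save energy
                case ACTIVE_LISTENING_SLEEPING: return State.LISTENING;  // listen for a while
                default:                        throw new AssertionError();
            }
        }

        // State to enter when the post-transmission listening interval expires
        // (only meaningful for the hybrid policy; AL keeps listening).
        static State afterListenTimeout(Policy policy) {
            return policy == Policy.ACTIVE_LISTENING ? State.LISTENING : State.SLEEPING;
        }
    }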

The simulation data obtained by implementing these algorithms using ns2 will show which program is the most efficient in terms of energy consumption and latency. These data will give researchers a better understanding of which models produce particular results, thereby meeting the goal of this project.

Toilers # 2

The Toilers Research Group is a graduate student research group at CSM focusing on wireless sensor networks. They have been developing a piece of software called iNSpect (interactive NS-2 protocol and environment confirmation tool) that graphically maps a set of nodes in a wireless network based on an existing trace file in one of three formats.

As our field session project, we will be adding a few features to the existing software:
1) adding parsing capability for the .NAM file format, a common format for this type of trace (see the sketch after this list),
2) allowing users to change the color scheme of the nodes displayed on the map while the simulation is taking place,
3) altering the .viz format to include mobility information, merging elements from a mobility trace and the existing .viz format, and
4) fixing as many known existing bugs as possible.
If time allows, we may attempt to add an ongoing statistics feature to the program and alter the data structures and functions to allow unordered read-in of node data.
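
A .NAM trace is line-oriented, with an event-type character followed by flag/value pairs such as -t (time) and -s (node id). The tokenizer below sketches how such lines might be split into fields; the specific flags and their meanings shown here are assumptions to be confirmed against the actual .NAM specification and iNSpect's needs.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch of parsing one .NAM trace line, e.g. "n -t 1.0 -s 3 -x 50.0 -y 20.0".
    // The flag meanings assumed here (-t time, -s node id, -x/-y position) must be
    // checked against the actual .NAM format.
    public class NamLineParser {
        public static Map<String, String> parse(String line) {
            String[] tokens = line.trim().split("\\s+");
            Map<String, String> fields = new HashMap<>();
            if (tokens.length == 0) return fields;
            fields.put("event", tokens[0]);            // e.g. "n" for a node event
            for (int i = 1; i + 1 < tokens.length; i += 2) {
                if (tokens[i].startsWith("-"))
                    fields.put(tokens[i].substring(1), tokens[i + 1]);
            }
            return fields;
        }

        public static void main(String[] args) {
            Map<String, String> f = parse("n -t 1.0 -s 3 -x 50.0 -y 20.0");
            System.out.println(f.get("t") + " " + f.get("s"));   // prints: 1.0 3
        }
    }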

Toilers # 3

Wireless sensor networks are useful for collecting large amounts of data from an environment. Usually, all of the data are collected in a database and analyzed offline. However, by the time a scientist has a chance to evaluate the data, they are outdated and may no longer be useful. The goal of this project is to make wireless sensor networks more interactive by allowing scientists to define data events of interest, a small subset of the data stream, and receive notification of only those events online.

To accomplish this task, we will write software for the wireless sensors and the data collection server. Our sensors in BB 154 can collect information on light, temperature, and humidity. The software will define when a sensor should send back data and possibly what data to send. The application on the server will decide when events occur, such as considerable changes in temperature or light readings, and publish an RSS feed for the client. Scientists will be able to define events and subscribe to the RSS notifications through a web interface.
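
Although the server application is planned in Python, the idea behind an event definition is language-independent; the sketch below (in Java, with semantics we are assuming rather than the project's actual interface) treats an event as a reading crossing a scientist-specified threshold or changing by more than a given delta since the previous reading.

    // Illustrative event definition: fire when a reading crosses a threshold or
    // changes by more than a given delta since the last reading (assumed semantics).
    public class SensorEvent {
        private final String sensorField;   // "temperature", "light", or "humidity"
        private final double threshold;     // absolute level of interest
        private final double minDelta;      // minimum change of interest
        private Double lastValue = null;

        public SensorEvent(String sensorField, double threshold, double minDelta) {
            this.sensorField = sensorField;
            this.threshold = threshold;
            this.minDelta = minDelta;
        }

        // Returns true if this new reading should trigger a notification
        // (e.g. a new item on the RSS feed for subscribed scientists).
        public boolean triggered(double reading) {
            boolean fire = reading >= threshold
                    || (lastValue != null && Math.abs(reading - lastValue) >= minDelta);
            lastValue = reading;
            return fire;
        }

        public String describe(double reading) {
            return sensorField + " event: reading " + reading;
        }
    }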

The final deliverable will be a demo that involves walking into a sensor network with a laptop, specifying interesting events, and then monitoring those events in real time. This demo will be submitted to ACM SenSys 2006, one of the top conferences in sensor networks. The software for the sensors will be written in nesC as a TinyOS component, meaning that any TinyOS application can easily include and utilize it. The server application will be written in Python because of its portability, the availability of preexisting modules, and rapid development time.