
Joe Paradiso, Media Lab

Evolution of Sensors

Sensor networks are evolving and converging to form a ubiquitous electronic "nervous system" that will soon extend across things, places, and people. Similarly, online virtual worlds are rapidly evolving - rich, shared virtual environments now exist that are populated by thousands of simultaneous visitors. The place where these two areas intersect - which we term "cross reality" - presents an opportunity for converging machine and human perception, while blurring conventional notions of "presence." Cross reality environments promise a natural way to blend built and natural environments, funneling information between the real and virtual worlds and manifesting it in various ways to emphasize different events, phenomena, and interactions. This talk will outline cross reality experiments done at the MIT Media Lab, and also illustrate various frontiers of sensing technology that we are currently exploring for sensing human activity, which can be relevant to addressing challenges in sensing nature.

(Tue 9:00am)

Yogesh Gianchandani, U-Mich/WIMS

Microsystems for Environmental Monitoring

Environmental monitoring applications have emerged as one of the important driving forces for research in MEMS and microsystems technology. Initial successes have been in the physical domain, with micromachined pressure sensors and inertial sensors now in widespread use. NSF and other organizations are now supporting academic research efforts for a wide range of environmental sensors. These include, for example, sensors for pressure, temperature, and humidity; gases and vapors; water quality; air-borne and water-borne pathogens; and radiation. Many of these devices consume very little power and are intended to be deployed with miniaturized wireless interfaces that are similarly power-efficient. This talk will provide a few examples of efforts underway and will identify some of the challenges that lie ahead.

(Tue 9:20am)

John Burt, U-Washington

Sensing the social network: using wireless tech to monitor animal social behavior

As with humans, animal societies can be viewed as networks of interconnected individuals, linked by social, spatial, temporal, and numerous other relationships. By studying social networks, we can derive unique insights into the workings of the society, better understand the behavioral strategies of individuals, and more accurately model and predict the emergent properties of social behavior. Animal social networks are particularly difficult to monitor, however, because individuals often conduct their business out of sight of eyes, cameras, and sensors. With circuit miniaturization and ultra-low-power design techniques, tools can now be constructed even for small animals that allow their social behavior to be monitored on a continuous basis. The University of Washington Encounternet project (www.encounternet.net) has developed a 1-gram association monitor and sensor tag that can be fitted onto small animals such as songbirds. Animal-worn tags log associations with other tags and operate indefinitely using solar energy harvesting. Tags communicate with a distributed network of fixed-position data collection nodes that autonomously download tag data and forward it to a central server via a cellular or wifi link. Encounternet has been designed with flexibility in mind and will be open-source and open-hardware so that it can be adapted by others for many different behavioral monitoring and sensing applications.
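A minimal sketch of the kind of association logging such a tag performs (the names, record format, and signal-strength threshold below are illustrative assumptions, not the actual Encounternet firmware):

```python
import time

RSSI_THRESHOLD = -70  # hypothetical signal-strength cutoff for counting an "association"

class EncounterLog:
    """Toy model of a tag that logs associations with nearby tags."""

    def __init__(self, tag_id):
        self.tag_id = tag_id
        self.encounters = []  # (timestamp, other_tag_id, rssi)

    def on_beacon(self, other_tag_id, rssi, timestamp=None):
        """Record an encounter when another tag's beacon is strong enough."""
        if rssi >= RSSI_THRESHOLD:
            ts = timestamp if timestamp is not None else time.time()
            self.encounters.append((ts, other_tag_id, rssi))

    def dump(self):
        """Return logged encounters, e.g. for upload to a base node."""
        return list(self.encounters)

tag = EncounterLog("bird_01")
tag.on_beacon("bird_02", rssi=-60, timestamp=100.0)  # strong signal: logged
tag.on_beacon("bird_03", rssi=-90, timestamp=101.0)  # too weak: ignored
```

In a real deployment the threshold would be calibrated so that received signal strength roughly corresponds to the proximity that counts as a social association for the species under study.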

(Tue 9:40am)

Ari Daniel Shapiro, MIT

Killer whales underwater and on air: Science in and beyond the tower

Norwegian killer whales feed on herring, and they do it as a team. They round the fish up from depth, bring them to the surface, gather them into a tight ball, tail slap the edge of the ball, and eat the incapacitated fish one by one. I'll discuss what we learned about the biology and behavior of these animals after we attached digital audio and movement tags to the killer whales. Then I'll talk about my transition to my current job as a radio producer and how I work to communicate science topics (including killer whales!) to a larger audience in a way that's informative and engaging.

(Tue 11:00am)

Robert Stevenson, U-Mass

Wireless Sensor Networks in Coastal Environments: Integrating Technologies and Sciences Across Disciplines

Fifty percent of the US population lives within 100 km of the coast, and the influx continues as more people migrate to the coastal edge. This concentration of humans significantly impacts coastal zones (e.g. beach erosion, eutrophication, overfishing, habitat alteration, and air pollution). Simultaneously, coastal environments impact human activities (e.g. protection from water erosion and major storms, location of transportation hubs, design of ports). The Center for Coastal Environmental Sensing Networks (CESN; www.cesn.org) was formed at UMass Boston in 2006 to advance the role of science in managing coastal zone problems faced by a diverse range of decision-makers and stakeholders, including federal, state, and municipal governments; coastal communities and property owners; and coastal and upstream agricultural and industrial businesses.

CESN assembles interdisciplinary teams to understand management requirements and to provide solutions by integrating sensors, sensor packages, and cyberinfrastructure with scientific analysis. We illustrate this approach with examples from the development of 1) a tidal gauge sensor to monitor storm surge and saltmarsh restoration, 2) a “swan” sensor package for monitoring ponds for middle school children, 3) a sensor network testbed in the Neponset River estuary and Boston Harbor, and 4) a Web 2.0 application to detect unusual events in coastal environments. Our philosophy is to integrate a wide selection of technologies that best match the cost and technology requirements of the management problem. Currently we are developing mobile sensors to study the thin layer of freshwater in estuaries that can transport large amounts of nutrients and carbon during storm events. Our open-source philosophy and real-time modeling approach should make data available to a wide audience.

(Tue 11:20am)

Dezhen Song, TAMU

Collaborative Observation of Natural Environments

Scientific study of wildlife requires detailed observations over extended periods of time and can be an arduous, expensive, dangerous, and lonely experience for scientists. Emerging advances in robot cameras, long-range wireless networking, and distributed sensors make feasible a new class of portable robotic observatories that allow groups of scientists, via the internet, to remotely observe, record, and index detailed animal activity. This class of systems is known as CONE (Collaborative Observation of Natural Environments). I will summarize the four-year development of CONE algorithms and systems, lessons learned, and results of field experiments, which have included application in the renewed search for the Ivory-billed Woodpecker.

(Tue 1:30pm)

Nikolaus Correll, University of Colorado

Building a Robotic Garden

This talk describes the architecture and implementation of a distributed autonomous gardening system. The garden is a mesh network of robots and plants. The gardening robots are mobile manipulators with an eye-in-hand camera. They are capable of locating plants in the garden, watering them, and locating and grasping fruit. The plants are potted cherry tomatoes enhanced with sensors and computation to monitor their well-being (e.g. soil humidity, state of fruits) and with networking to communicate servicing requests to the robots. Task allocation, sensing, and manipulation are distributed in the system and coordinated in a decentralized manner. We describe the architecture of this system and present experimental results for navigation, object recognition, and manipulation.

(Tue 1:50pm)

Greg Marshall, National Geographic

Crittercam: Can You See Me Now?

“Crittercam” is an animal-borne imaging and data logging tool that was invented to explore and study cryptic behavior of marine and terrestrial species. For more than twenty years researchers have used this remote imaging concept in studies of more than 60 species (from emperor penguins to blue whales) and have documented unique and sometimes extraordinary natural events from the perspective of wild, free-ranging animals. The conservation impacts of some of the discoveries we’ve made have been significant. For all that, we’ve barely scratched the surface of the potential of this emerging field of research. Virtually every Crittercam deployment raises as many questions as it answers. Many aspects of the behavior and ecology of little-known species remain mysterious due to a lack of tools for unobtrusive direct observation. I will explore the opportunities and challenges facing this growing field in the context of past, present, and future technologies, research and conservation imperatives, as well as the encroaching ethical dilemma of capturing a wild point of view from the back of charismatic megafauna.

(Tue 2:10pm)

Judy Cushing, Evergreen State College

VISTAS

During the past 15 years, Cushing’s Scientific Database Lab at Evergreen has been developing informatics tools for canopy researchers (http://canopy.evergreen.edu/cdb). For example, DataBank uses domain-specific database components to help scientists design and populate their own databases (http://canopy.evergreen.edu/databank), and the StudyCenter contains DataBank databases (http://canopy.evergreen.edu/studycenter). To help motivate the use of databases (vs. spreadsheets or flat files), her team implemented CanopyView, which takes a DataBank database and, using the Visualization Toolkit (VTK; Kitware Inc., http://www.kitware.com), builds the 2- and 3-D visualizations that can be composed from that particular dataset (http://canopy.evergreen.edu/canopyview). As Cushing’s lab became known among local ecologists for visualization, her team was asked to build visualizations the ecologists could not create on their own, and the VISualization of Terrestrial-Aquatic Systems (VISTAS) project was born. The VISTAS project is currently pursuing visual analytics research and development for environmental science, serving collaborators (e.g., modelers, field forest ecologists, hydrologists, atmospheric scientists) who need tools that focus on ecological processes and easily integrate complex topography with landscape data, remotely sensed data (e.g., LiDAR), and analytics. http://acdrupal.evergreen.edu/scidb/visualization has some preliminary visualizations. This work has been funded by the National Science Foundation, most recently: DBI-0417311, IIS-0639588, and -0917708.

(Tue 3:00pm)

Carlos J. Corrada-Bravo, U-Puerto Rico

Towards an Automated Remote Biodiversity Monitoring Network

There is an urgent need to increase the temporal and spatial coverage of ecological data collection in response to the myriad anthropogenic threats (e.g. the extinction crisis, global warming) to global biodiversity; fortunately, the technology to do so exists. During the last two years, we have developed the Automated Remote Biodiversity Monitoring Network (ARBIMON) and demonstrated how inexpensive monitoring stations can collect a continuous stream of biodiversity data (acoustics and photographs). Although much of the process is automated, and for many species it will be possible to automate species identification, a major limitation is that most end users do not have access to the tools needed to manage and analyze these large data sets.

To address this need, we have developed applications using Machine Learning techniques to automate species detection. Specifically, a feedback web application allows users to visualize the results of the initial algorithm, select correct responses, incorporate the correct responses into the training data to improve the model, and then reanalyze the data. These tools and the iterative process allow the user to quickly build accurate species identification algorithms. It is important to note that after a model has been approved, the system will be able to evaluate recordings without any human intervention. Currently we are in the testing stage creating algorithms for birds, amphibians, and monkeys using recordings from our stations in Sabana Seca, Puerto Rico and La Selva, Costa Rica.
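The iterative feedback process described above can be sketched roughly as follows. This is a toy human-in-the-loop loop with a nearest-centroid classifier standing in for the actual detection model; all function names and the example species are illustrative assumptions, not ARBIMON's code:

```python
def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(features, labels):
    """Fit a toy nearest-centroid 'species detector' (stand-in for the real model)."""
    by_class = {}
    for x, y in zip(features, labels):
        by_class.setdefault(y, []).append(x)
    return {y: centroid(xs) for y, xs in by_class.items()}

def predict(model, x):
    """Return the class whose centroid is nearest to x."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda y: dist(model[y], x))

def refine(features, labels, new_clips, review_fn):
    """Feedback loop: predict on each new clip, let the user confirm or
    correct the prediction (the web-application step), fold the confirmed
    label back into the training data, and retrain."""
    model = train(features, labels)
    for x in new_clips:
        confirmed = review_fn(x, predict(model, x))
        features.append(x)
        labels.append(confirmed)
        model = train(features, labels)
    return model

# Hypothetical two-species example; review_fn here simply accepts every prediction.
example_model = refine([[1.0, 0.0], [0.0, 1.0]], ["coqui", "warbler"],
                       [[0.9, 0.1]], review_fn=lambda x, pred: pred)
```

Once the user approves a refined model, `predict` can run over new recordings with no human intervention, which matches the workflow the abstract describes.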

(Tue 3:20pm)

Rich Fletcher, MIT/Media Lab

Scalable, Low-Cost Instrumentation for Environmental Mapping

In developing countries as well as the developed world, there is a need for simple tools to record and monitor environmental data. The primary challenge in these cases is not technical complexity but rather scalability, cost, and power consumption. I will present a hardware sensor platform and associated web site that we have created to enable people to easily collect and map their own sensor data. The sensor box also contains an integrated Bluetooth radio that enables connectivity and interaction with mobile phones without the need for any custom software loaded onto the phone.

(Tue 3:40pm)

Glorianna Davenport, MIT/Media Lab

Evolving Nature Stories

Today, as we seek to develop a more holistic view of change to our natural ecosystems, wild and conserved open-space projects offer a precious resource. In addition to encouraging active lifestyles and providing needed contemplative experiences, these open lands instrumented with appropriate technologies can serve as living laboratories allowing scientists, conservationists, educators, and the general public to monitor slow stories of natural transformation more completely and more profoundly than ever before. In this talk, we will explore how we might deploy mobile technologies, sensors, and the internet on a large property in Eastern Massachusetts to create a long-term, accessible terrestrial observatory as we transition the property from an active cranberry farm to a wilder wetland and upland habitat. In creating this observatory, we seek to balance access and participation by scientists, educators, and the general public.

(Wed 9:00am)

Mike Saunders, Kew/Royal Botanic Gardens

The Anatomy Of A Plant Identification Tool For Mobile Devices

Obtaining the correct scientific (Latin ‘binomial’) name for a plant is essential to accessing any further information about the species. Traditionally, identification is achieved through a specialist process of visually identifying plant features or, increasingly, through DNA barcoding. These methods require highly specialised knowledge or equipment and are hence often inaccessible to non-specialists. Using a combination of social networks, plant characteristics, GPS, and image recognition software, we are developing an approach to plant identification that could be offered via mobile devices to the whole population.

(Wed 9:20am)

Stephen Brown, Manomet

Making Arctic Shorebirds and their Habitats Virtually Accessible to the Public

As shorebirds continue to decline worldwide, it is urgent that we determine why and put in place hemispheric-scale conservation measures while populations are still viable. Building interaction with the public is critical to developing support for the necessary conservation measures. Conserving species before they are severely threatened with extinction makes sense for society, because costs are much lower and because the health of the ecosystems on which they depend is almost always important for other economic purposes in addition to supporting wildlife populations. Over the past four years we have completed the first survey ever conducted of the Arctic National Wildlife Refuge coastline, where thousands of shorebirds of 14 different species congregate to prepare for their southbound migrations. When we began, virtually nothing was known about these remote coastal habitats. We recently returned from our final year of travelling by small inflatable boat to the inaccessible coastal mudflats to collect survey data. Because the important areas are extremely remote and conditions in the Arctic are harsh, most people will never experience these habitats themselves, so finding ways to make them accessible remotely is critical. We use a blog featuring podcasts via satellite phone, embedded photos and video, and a web-connected SPOT Adventures GPS device to report our team’s location in real time. We also use a web-based citizen science portal to collect information about populations of shorebirds as they migrate through the continental U.S., and to further involve the public in their conservation.

(Wed 9:40am)

John Pickering, Discover Life

Discover Life

(Wed 10:00am)

Dale Joachim, MIT/Media Lab

Interactive Cellphone Playback Surveys

The Owl Project illustrates the potential role of mobile-based technology in enhancing existing modes of human-nature interaction. Playback surveys, a longstanding method of assessing species populations, involve broadcasting vocalizations from a specific location in the natural habitat, stimulating vocal and other behavioral responses from resident individuals of that species, which are tallied to create census data. Conventionally, such surveys have been performed on site by biologists or citizen scientists, sometimes requiring them to spend long periods in uncomfortable or unsafe locations. I have explored the use of mobile devices, sensors and the Internet to conduct owl playback surveys, obviating the need for on-site observers and also providing added benefits. First, broadcasts can be adapted, in real time, in response to owl vocalizations. Second, internet users can participate in the survey, facilitating implementation of the survey over a wide geographic area. The phone survey method provides scalability, i.e. ease of increasing the number of participants, as well as flexibility and ease of access to survey results in real time. A mobile phone, connected to the Internet, thus becomes a powerful tool in the hands of citizen scientists, biologists or activists. This presentation will describe the infrastructure created for the Owl Project and further questions that lie ahead.

(Wed 10:50am)

Dave Potter, Unity College

Cell phones, owls, and citizen science at Unity College

Thirteen wildlife, ecology, and conservation law enforcement students at Unity College coordinated with the Media Lab to call to and listen for common owls and other wildlife from February to April 2009. Students communicated with the Media Lab server through a variety of cell phone systems and providers. Although owls responded infrequently, the Owl Project protocol proved adaptable to citizen science field situations.

(Wed 11:10am)

Eric Klopfer, MIT

Community Science Investigators

Community Science Investigations (CSI) is a collaborative project that uses science and technology to develop solutions for real problems. Student teams will use computer tools (GIS) to answer questions and test hypotheses about their neighborhoods. After identifying an issue in the community, student teams will create mobile computer games based on that issue. We will work together to find ways to make our community better.

(Wed 11:30am)

Fernanda Viegas, IBM

Visualizing text

Visualization is often viewed as a way to unlock the secrets of numeric data. But what about political speeches, novels, and blogs? These texts hold at least as many surprises. On the Many Eyes site, a place for public visualization, we have seen an increasing appetite for collectively analyzing documents.

We present a series of techniques for visualizing and analyzing unstructured text. We also discuss how public events such as the presidential election last year catalyze people's passion for making sense of prose.

(Wed 1:00pm)

Kate Beard-Tisdale, U-Maine

Visualizing and Exploring Events from Sensor Networks

The expanding deployment of sensor systems that capture location, time, and multiple thematic variables raises a need for exploratory spatio-temporal data analysis tools. Geographic information systems (GIS) and time series analysis tools support exploration of spatial and temporal patterns respectively and independently, but tools for the exploration of both dimensions within a single system are relatively rare. This presentation will describe a framework for the visualization and exploration of spatial, temporal, and thematic dimensions of sensor-based data. The unit of analysis is an event, a spatio-temporal data type extracted from sensor data. An event viewer allows exploration of spatial and temporal trends, temporal relationships among events, periodic temporal patterns, the timing of irregularly repeating events, event-event relationships in terms of thematic attributes, and event patterns at different spatial and temporal granularities.
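A minimal sketch of such an event data type, with one example of a temporal relationship between events (the field names and the single `overlaps` predicate are illustrative assumptions, not the presented framework's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """A spatio-temporal event extracted from sensor data: a location,
    a time interval, and a bag of thematic attributes."""
    lat: float
    lon: float
    start: float  # interval start (e.g. Unix seconds)
    end: float    # interval end
    attrs: dict = field(default_factory=dict)  # thematic variables

    def overlaps(self, other):
        """One Allen-style temporal relationship: the two intervals overlap."""
        return self.start < other.end and other.start < self.end

# Hypothetical coastal-sensing events.
bloom = Event(43.9, -69.2, start=100.0, end=200.0, attrs={"chlorophyll": 9.1})
storm = Event(43.8, -69.3, start=150.0, end=300.0, attrs={"wind_mps": 21.0})
```

An event viewer of the kind described would layer many such predicates (before, during, meets, etc.) over collections of these objects, alongside spatial and thematic queries.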

(Wed 1:20pm)

Peggy Agouris, George Mason University

Spatiotemporal Modeling and Monitoring of Dynamic Phenomena in Sensor Networks

As geosensor networks rapidly emerge as a novel paradigm for data collection, and spatial datasets become increasingly spatiotemporal, we are faced with interesting challenges related to the analysis of this information. In this presentation, we focus primarily on issues related to the modeling and comparison of dynamic phenomena to support the monitoring of emerging (and evolving) emergency situations. We also demonstrate the performance of a spatio-temporal modeling and analysis methodology based on a helix representation, using atmospheric emissions as a sample case.

(Wed 1:40pm)

Bruce Walker, Georgia Tech

Sonification, Assistive Technology, and Accessible Aquaria

Mobile devices can enable revolutionary assistive technologies. Multimodal interfaces and sonification can make wayfinding, math, science, zoos, and aquaria accessible to people with vision loss. I will discuss our advances and challenges in all these areas.

(Wed 2:40pm)

Myriel Milicevic, Interaction Design Studios

Neighbourhood Satellites

Neighbourhood Satellites dive into urban enviro-sensing, survey Airconomies and Raise the Rockies.

(Wed 3:00pm)

Natalie Jeremijenko, NYU

Texting Fish, Wrestling Beetles, Translating Birds, and Other Cross-Species Adventures

(Wed 3:20pm)