Today's Recommendations

2006 - MIR '06

Evaluation campaigns and TRECVid

The TREC Video Retrieval Evaluation (TRECVid) is an international benchmarking activity that encourages research in video information retrieval by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. TRECVid completed its fifth annual cycle at the end of 2005, and in 2006 it will involve almost 70 research organizations, universities, and other consortia. Throughout its existence, TRECVid has benchmarked both interactive and automatic/manual searching for shots within a video corpus, automatic detection of a variety of semantic and low-level video features, shot boundary detection, and the detection of story boundaries in broadcast TV news. This paper gives an introduction to information retrieval (IR) evaluation from both a user and a system perspective, highlighting that system evaluation is by far the most prevalent type of evaluation carried out. We also include a summary of TRECVid as an example of a system evaluation benchmarking campaign, which allows us to discuss whether such campaigns are a good thing or a bad thing. There are arguments for and against these campaigns, and we present some of them in the paper, concluding that on balance they have had a very positive impact on research progress.

2005 - SenSys '05

Data collection, storage, and retrieval with an underwater sensor network

In this paper we present a novel platform for underwater sensor networks to be used for long-term monitoring of coral reefs and fisheries. The sensor network consists of static and mobile underwater sensor nodes. The nodes communicate point-to-point using a novel high-speed optical communication system integrated into the TinyOS stack, and they broadcast using an acoustic protocol integrated in the TinyOS stack. The nodes have a variety of sensing capabilities, including cameras, water temperature, and pressure. The mobile nodes can locate and hover above the static nodes for data muling, and they can perform network maintenance functions such as deployment, relocation, and recovery. In this paper we describe the hardware and software architecture of this underwater sensor network. We then describe the optical and acoustic networking protocols and present experimental networking results and data collected in a pool, in rivers, and in the ocean. Finally, we describe our experiments with mobility for data muling in this network.

2014

Recurrence in Ergodic Theory and Combinatorial Number Theory

Topological dynamics and ergodic theory have usually been treated independently. H. Furstenberg, instead, develops the common ground between them by applying the modern theory of dynamical systems to combinatorics and number theory. Originally published in 1981. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These editions preserve the original texts of these important books while presenting them in durable paperback form. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.

2003 - Ecological Economics

Random effects analysis

This paper explores relationships among environmental attitudes, nonuse values for endangered species, and underlying motivations for contingent valuation (CV) responses. The approach combines techniques from the attitude–behavior and economic valuation literature. Attitudes are measured with the New Ecological Paradigm (NEP) scale, and economic values are derived from a referendum CV survey for peregrine falcons and shortnose sturgeons. Respondents with stronger pro-environmental attitudes are found to be more likely to provide legitimate yes/no responses, while those with weaker attitudes are more likely to protest hypothetical CV scenarios. Analysis reveals environmental attitudes as a significant explanatory variable of yes/no responses, whereby stronger pro-environmental attitudes result in higher probabilities of responding ‘yes’. Pro-environmental attitudes are also shown to result in higher estimates of mean willingness to pay (WTP). Significant relationships are found between environmental attitudes and nonuse motivations. Specifically, pro-environmental attitudes are associated with stronger reliance on ethical motives for species protection. These results are discussed as they relate to testing predictions in the literature about potential bias in CV studies and to supporting National Oceanic and Atmospheric Administration (NOAA) recommendations for improving CV reliability.
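As a sketch of the mechanics behind such referendum responses, the snippet below uses a linear-in-bid logit model in which the NEP attitude score shifts both the probability of a 'yes' and mean WTP. All coefficients are hypothetical, chosen only to illustrate the direction of the effects reported above, not estimated from the study's data.

```python
import math

def p_yes(bid, nep, a=0.8, b=0.05, c=0.6):
    """Probability of a 'yes' referendum response under an assumed
    logit model with utility index a - b*bid + c*nep.
    All coefficients are hypothetical, for illustration only."""
    u = a - b * bid + c * nep
    return 1.0 / (1.0 + math.exp(-u))

def mean_wtp(nep, a=0.8, b=0.05, c=0.6):
    """Mean WTP for the linear logit model: the bid at which
    P(yes) = 0.5, i.e. (a + c*nep) / b."""
    return (a + c * nep) / b

# Stronger pro-environmental attitudes (higher NEP score) raise both
# the probability of a 'yes' at a given bid and mean WTP:
assert p_yes(20, nep=4) > p_yes(20, nep=1)
assert mean_wtp(4) > mean_wtp(1)
```

The assertions mirror the paper's qualitative findings: a positive attitude coefficient raises both the 'yes' probability and the implied mean WTP.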

2005 - ACM Trans. Program. Lang. Syst.

Combinators for bi-directional tree transformations: a linguistic approach to the view update problem

We propose a novel approach to the view-update problem for tree-structured data: a domain-specific programming language in which all expressions denote bidirectional transformations on trees. In one direction, these transformations---dubbed lenses---map a concrete tree into a simplified abstract view; in the other, they map a modified abstract view, together with the original concrete tree, to a correspondingly modified concrete tree. Our design emphasizes both robustness and ease of use, guaranteeing strong well-behavedness and totality properties for well-typed lenses. We begin by identifying a natural space of well-behaved bidirectional transformations over arbitrary structures, studying definedness and continuity in this setting. We then instantiate this semantic framework in the form of a collection of lens combinators that can be assembled to describe bidirectional transformations on trees. These combinators include familiar constructs from functional programming (composition, mapping, projection, conditionals, recursion) together with some novel primitives for manipulating trees (splitting, pruning, merging, etc.). We illustrate the expressiveness of these combinators by developing a number of bidirectional list-processing transformations as derived forms. An extended example shows how our combinators can be used to define a lens that translates between a native HTML representation of browser bookmarks and a generic abstract bookmark format.
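A minimal sketch of the idea in Python, with dicts standing in for trees; `field` and `compose` are illustrative stand-ins, not the paper's combinators. The two assertions at the end correspond to the round-tripping (well-behavedness) laws the paper guarantees for well-typed lenses.

```python
class Lens:
    """A lens is a pair of functions: get maps a concrete tree to an
    abstract view; put maps a modified view, together with the
    original concrete tree, back to a modified concrete tree."""
    def __init__(self, get, put):
        self.get = get              # concrete -> abstract
        self.put = put              # (abstract, concrete) -> concrete

def field(name):
    """Lens focusing on one key of a dict, hiding the rest."""
    return Lens(
        get=lambda c: c[name],
        put=lambda a, c: {**c, name: a},
    )

def compose(l1, l2):
    """Sequential composition: view through l1, then through l2."""
    return Lens(
        get=lambda c: l2.get(l1.get(c)),
        put=lambda a, c: l1.put(l2.put(a, l1.get(c)), c),
    )

bookmark = {"meta": {"title": "ACM", "visits": 3}, "url": "acm.org"}
title = compose(field("meta"), field("title"))

# GetPut law: putting back an unmodified view is a no-op.
assert title.put(title.get(bookmark), bookmark) == bookmark
# PutGet law: getting after a put returns the view that was put.
assert title.get(title.put("DL", bookmark)) == "DL"
```

The composition's put direction shows the characteristic pattern: the inner lens rebuilds the intermediate tree, which the outer lens then folds back into the original concrete tree.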

2011 - IEEE Journal on Selected Areas in Communications

Base Station Operation and User Association Mechanisms for Energy-Delay Tradeoffs in Green Cellular Networks

Energy efficiency, one of the major design goals in wireless cellular networks, has received much attention lately, due to increased awareness of environmental and economic issues for network operators. In this paper, we develop a theoretical framework for BS energy saving that encompasses dynamic BS operation and the related problem of user association together. Specifically, we formulate a total cost minimization that allows for a flexible tradeoff between flow-level performance and energy consumption. For the user association problem, we propose an optimal energy-efficient user association policy and further present a distributed implementation with provable convergence. For the BS operation problem (i.e., switching BSs on/off), which is a challenging combinatorial problem, we propose simple greedy-on and greedy-off algorithms inspired by the mathematical background of the submodular maximization problem. Moreover, we propose other heuristic algorithms, based on the distances between BSs or the utilizations of BSs, that do not impose any additional signaling overhead and thus are easy to implement in practice. Extensive simulations under various practical configurations demonstrate that the proposed user association and BS operation algorithms can significantly reduce energy consumption.
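A toy version of the greedy-off idea can be sketched as follows, under an assumed cost model: unit energy per active BS, an M/M/1-style flow-level delay proxy, and a distance penalty for reassigned users. None of these constants or modeling choices come from the paper; the point is only the greedy switch-off loop.

```python
def total_cost(active, users, energy_per_bs=1.0, per_user_load=0.1,
               delay_weight=0.05):
    """Energy cost of the active BSs plus an M/M/1-style delay proxy
    rho/(1-rho) per BS and a distance-based transmission term; each
    user attaches to its nearest active BS (1-D positions)."""
    if not active:
        return float("inf")
    load = {b: 0.0 for b in active}
    dist = 0.0
    for u in users:
        nearest = min(active, key=lambda b: abs(b - u))
        load[nearest] += per_user_load
        dist += abs(nearest - u)
    delay = 0.0
    for rho in load.values():
        if rho >= 1.0:
            return float("inf")        # infeasible: BS overloaded
        delay += rho / (1.0 - rho)
    return energy_per_bs * len(active) + delay + delay_weight * dist

def greedy_off(stations, users):
    """Switch off, one BS at a time, the one whose removal increases
    total cost the least; stop when no switch-off lowers the cost."""
    active = set(stations)
    while len(active) > 1:
        best = min(active, key=lambda b: total_cost(active - {b}, users))
        if total_cost(active - {best}, users) >= total_cost(active, users):
            break
        active.remove(best)
    return active

bs = [0.0, 5.0, 10.0]
users = [0.1, 0.2, 9.8, 9.9]           # nobody near the middle BS
assert greedy_off(bs, users) == {0.0, 10.0}
```

With all users clustered near the edge stations, the lightly loaded middle BS is switched off first, and the loop stops once any further switch-off would hurt the energy-delay tradeoff.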

2004 - IEEE International Conference on Communications

Homogeneous vs heterogeneous clustered sensor networks: a comparative study

This paper presents a cost-based comparative study of homogeneous and heterogeneous clustered sensor networks. We focus on the case where the base station is remotely located and the sensor nodes are not mobile. Since we are concerned with the overall network dimensioning problem, we take into account the manufacturing cost of the hardware as well as the battery energy of the nodes. A homogeneous sensor network consists of identical nodes, while a heterogeneous sensor network consists of two or more types of nodes (organized into hierarchical clusters). We first consider single hop clustered sensor networks (nodes use single hopping to reach the cluster heads). We use LEACH as the representative single hop homogeneous network, and a sensor network with two types of nodes as a representative single hop heterogeneous network. For multihop homogeneous networks (nodes use multihopping to reach the cluster head), we propose and analyze a multihop variant of LEACH that we call M-LEACH. We show that M-LEACH has better energy efficiency than LEACH in many cases. We then compare the cost of multihop clustered sensor networks with M-LEACH as the representative homogeneous network, and a sensor network with two types of nodes (that use in-cluster multihopping) as the representative heterogeneous network.
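The intuition for why multihopping to the cluster head (as in M-LEACH) can beat single hopping is visible in the standard first-order radio model: electronics cost per bit plus an amplifier cost growing with distance squared. The constants below are common textbook values, not taken from the paper.

```python
E_ELEC = 50e-9      # J/bit, transmit/receive electronics (assumed)
EPS = 100e-12       # J/bit/m^2, amplifier energy (assumed)

def single_hop_energy(bits, d):
    """Direct transmission over distance d."""
    return bits * (E_ELEC + EPS * d ** 2)

def two_hop_energy(bits, d):
    """Relay at the midpoint: two transmissions over d/2 plus one
    reception at the relay node."""
    tx = 2 * bits * (E_ELEC + EPS * (d / 2) ** 2)
    rx = bits * E_ELEC
    return tx + rx

# Short range: the relay's extra electronics cost dominates.
assert single_hop_energy(1000, 20) < two_hop_energy(1000, 20)
# Long range: the d^2 amplifier term dominates and multihop wins.
assert single_hop_energy(1000, 100) > two_hop_energy(1000, 100)
```

Setting the two expressions equal gives a crossover distance d² = 4·E_ELEC/EPS, which is why multihopping pays off only beyond a certain cluster radius.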

2004 - Computer

WiseNET: an ultralow-power wireless sensor network solution

A wireless sensor network consists of many energy-autonomous microsensors distributed throughout an area of interest. Each node monitors its local environment, locally processing and storing the collected data so that other nodes can use it. To optimize power consumption, the Swiss Center for Electronics and Microtechnology has developed WiseNET, an ultralow-power platform for the implementation of wireless sensor networks that achieves low-power operation through a careful codesign approach. The WiseNET platform uses a codesign approach that combines a dedicated duty-cycled radio with WiseMAC, a low-power media access control protocol, and a complex system-on-chip sensor node to exploit the intimate relationship between MAC-layer performance and radio transceiver parameters. The WiseNET solution consumes about 100 times less power than comparable solutions.

1992 - Water Resources Research

The validity of a simple statistical model for estimating fluvial constituent loads: An Empirical study involving nutrient loads entering Chesapeake Bay

We consider the appropriateness of “rating curves” and other log linear models to estimate the fluvial transport of nutrients. Split-sample studies using data from tributaries to the Chesapeake Bay reveal that a minimum variance unbiased estimator (MVUE), based on a simple log linear model, provides satisfactory load estimates, even in some cases where the model exhibited significant lack of fit. For total nitrogen (TN) the average difference between the MVUE estimates and the observed loads ranges from −8% to + 2% at the four sites. The corresponding range for total phosphorus (TP) is −6% to +5%. None of these differences is statistically significant. The observed variability of the MVUE load estimates for TN and TP, which ranges from 7% to 25% depending on the case, is accurately predicted by statistical theory.
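A sketch of the estimator family discussed above: fit the log linear rating curve by OLS, then back-transform with the simple lognormal bias correction exp(s²/2). This is an approximation to the minimum variance unbiased estimator (the MVUE replaces exp(s²/2) with a series in the residual variance), but it illustrates the mechanics; the synthetic power-law data below are not from the study.

```python
import math

def fit_log_linear(flows, loads):
    """OLS of ln(load) = b0 + b1*ln(flow); returns (b0, b1, s2),
    where s2 is the residual variance of the log regression."""
    x = [math.log(q) for q in flows]
    y = [math.log(l) for l in loads]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    b0 = ybar - b1 * xbar
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return b0, b1, s2

def predict_load(flow, b0, b1, s2):
    """Back-transformed prediction with the exp(s2/2) lognormal
    bias correction (naive exp(b0 + b1*ln(q)) is biased low)."""
    return math.exp(b0 + b1 * math.log(flow) + s2 / 2.0)

# Synthetic data following a known power law: load = 2 * flow^1.5
flows = [1.0, 2.0, 4.0, 8.0, 16.0]
loads = [2.0 * q ** 1.5 for q in flows]
b0, b1, s2 = fit_log_linear(flows, loads)
assert abs(b1 - 1.5) < 1e-9
assert abs(predict_load(4.0, b0, b1, s2) - 16.0) < 1e-6
```

On noisy data s2 is positive and the correction matters; summing the corrected predictions over a record gives the load estimate whose bias the split-sample study evaluates.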

2006 - Asian Journal of Mathematics

A Complete Proof of the Poincaré and Geometrization Conjectures - application of the Hamilton-Perelman theory of the Ricci flow

In this paper, we give a complete proof of the Poincaré and geometrization conjectures. This work depends on the cumulative work of many geometric analysts over the past thirty years. The proof should be considered as the crowning achievement of the Hamilton-Perelman theory of Ricci flow.

2008 - Water Resources Research

Analysis of terrestrial water storage changes from GRACE and GLDAS

Since March 2002, the Gravity Recovery and Climate Experiment (GRACE) has provided first estimates of land water storage variations by monitoring the time-variable component of Earth's gravity field. Here we characterize spatial-temporal variations in terrestrial water storage changes (TWSC) from GRACE and compare them to those simulated with the Global Land Data Assimilation System (GLDAS). Additionally, we use GLDAS simulations to infer how TWSC is partitioned into snow, canopy water, and soil water components, and to understand how variations in the hydrologic fluxes act to enhance or dissipate the stores. Results quantify the range of GRACE-derived storage changes during the studied period and place them in the context of seasonal variations in global climate and of hydrologic extremes, including drought and flood, which impact land memory processes. The role of the largest continental river basins as major locations for freshwater redistribution is highlighted. GRACE-based storage changes are in good agreement with those obtained from GLDAS simulations. Analysis of GLDAS-simulated TWSC illustrates several key characteristics of spatial and temporal land water storage variations. Global averages of TWSC were partitioned nearly equally between soil moisture and snow water equivalent, while zonal averages of TWSC revealed the importance of soil moisture storage at low latitudes and snow storage at high latitudes. Evapotranspiration plays a key role in dissipating globally averaged terrestrial water storage. Latitudinal averages showed how precipitation dominates TWSC variations in the tropics, evapotranspiration is most effective in the midlatitudes, and snowmelt runoff is a key dissipating flux at high latitudes. Results have implications for monitoring water storage response to climate variability and change, and for constraining land model hydrology simulations.

2008 - TRECVID

The MediaMill TRECVID 2008 Semantic Video Search Engine

In this paper we describe our TRECVID 2008 video retrieval experiments. The MediaMill team participated in three tasks: concept detection, automatic search, and interactive search. Rather than continuing to increase the number of concept detectors available for retrieval, our TRECVID 2008 experiments focus on increasing the robustness of a small set of detectors using a bag-of-words approach. To that end, our concept detection experiments emphasize in particular the role of visual sampling, the value of color invariant features, the influence of codebook construction, and the effectiveness of kernel-based learning parameters. For retrieval, a robust but limited set of concept detectors makes it necessary to rely on as many auxiliary information channels as possible. Therefore, our automatic search experiments focus on predicting which information channel to trust given a certain topic, leading to a novel framework for predictive video retrieval. To improve the video retrieval results further, our interactive search experiments investigate the roles of visualizing preview results for a certain browse dimension and of active learning mechanisms that learn to solve complex search topics by analyzing user browsing behavior. The 2008 edition of the TRECVID benchmark has been the most successful MediaMill participation to date, resulting in the top ranking for both concept detection and interactive search, and a runner-up ranking for automatic retrieval. Again a lot has been learned during this year's TRECVID campaign; we highlight the most important lessons at the end of this paper.
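The bag-of-words representation at the heart of such concept detectors can be sketched as follows. The toy 1-D k-means and scalar "descriptors" are simplifications for illustration; the sampling strategies, color-invariant features, and kernel classifiers studied in the paper are out of scope.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means on scalars; returns k codebook centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[i].append(p)
        # keep the old center if a cluster empties out
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

def bow_histogram(descriptors, codebook):
    """Quantize each descriptor to its nearest codeword and return
    the L1-normalized codeword histogram."""
    hist = [0.0] * len(codebook)
    for d in descriptors:
        i = min(range(len(codebook)), key=lambda j: (d - codebook[j]) ** 2)
        hist[i] += 1.0
    total = sum(hist)
    return [h / total for h in hist]

# Descriptors fall into two visual "words" near 0 and near 10:
training = [0.1, 0.2, -0.1, 9.9, 10.1, 10.0]
codebook = kmeans(training, 2)
image_hist = bow_histogram([0.0, 10.0, 10.0], codebook)
assert abs(sum(image_hist) - 1.0) < 1e-9
```

The resulting fixed-length histogram is what a kernel-based classifier would consume, one histogram per shot, regardless of how many descriptors were sampled.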

1973 - American Political Science Review

Schema Theory: An Information Processing Model of Perception and Cognition

The world is complex, and yet people are able to make some sense out of it. This paper offers an information-processing model to describe this aspect of perception and cognition. The model assumes that a person receives information which is less than perfect in terms of its completeness, its accuracy, and its reliability. The model provides a dynamic description of how a person evaluates this kind of information about a case, how he selects one of his pre-existing patterns (called schemata) with which to interpret the case, and how he uses the interpretation to modify and extend his beliefs about the case. It also describes how this process allows the person to make the internal adjustments which will serve as feedback for the interpretation of future information. A wide variety of evidence from experimental and social psychology is cited to support the decisions which went into constructing the separate parts of the schema theory, and further evidence is cited supporting the theory's system-level predictions. Since the schema theory allows for (but does not assume) the optimization of its parameters, it is also used as a framework for a normative analysis of the selection of schemata. Finally, a few illustrations from international relations and especially foreign-policy formation show that this model of how people make sense out of a complex world can be directly relevant to the study of important political processes.

2014

Guidelines and Standard Procedures for Continuous Water-Quality Monitors: Station Operation, Record Computation, and Data Reporting


1972

Measures of Association for Cross Classifications, IV: Simplification of Asymptotic Variances

The asymptotic sampling theory discussed in our 1963 article [3] for measures of association presented in earlier articles [1, 2] turns on the derivation of asymptotic variances that may be complex and tedious in specific cases. In the present article, we simplify and unify these derivations by exploiting the expression of measures of association as ratios. Comments on the use of asymptotic variances, and on a trap in their calculation, are also given.
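The ratio trick pairs naturally with the delta method: for a measure expressed as v = a/b, the asymptotic variance follows from the first-order expansion Var(a/b) ≈ Var(a)/b² + a²Var(b)/b⁴ − 2a·Cov(a,b)/b³. The moments below are hypothetical, chosen only to exercise the formula against a Monte Carlo check; they are not from the article.

```python
import math
import random

def ratio_variance(a, b, var_a, var_b, cov_ab):
    """Delta-method asymptotic variance of the ratio a/b."""
    return (var_a / b**2
            + (a**2 / b**4) * var_b
            - 2.0 * (a / b**3) * cov_ab)

# Monte Carlo check on correlated normals (hypothetical moments:
# mean 2 and 4, both sd 0.05, correlation 0.5):
rng = random.Random(1)
samples = []
for _ in range(100_000):
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    a = 2.0 + 0.05 * z1
    b = 4.0 + 0.05 * (0.5 * z1 + math.sqrt(0.75) * z2)
    samples.append(a / b)
m = sum(samples) / len(samples)
mc_var = sum((s - m) ** 2 for s in samples) / len(samples)
delta_var = ratio_variance(2.0, 4.0, 0.05**2, 0.05**2, 0.5 * 0.05**2)
assert abs(mc_var - delta_var) / delta_var < 0.05
```

For measures of association, a and b are sums of cell probabilities, so their variances and covariance reduce to multinomial moments, which is exactly the simplification the article exploits.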

2006 - Journal of Geophysical Research

Spatiotemporal filtering using principal component analysis and Karhunen-Loeve expansion approaches for regional GPS network analysis

Spatial filtering is an effective way to improve the precision of coordinate time series for regional GPS networks by reducing so-called common mode errors, thereby providing better resolution for detecting weak or transient deformation signals. The commonly used approach to regional filtering assumes that the common mode error is spatially uniform, which is a good approximation for networks of hundreds of kilometers extent, but breaks down as the spatial extent increases. A more rigorous approach should remove the assumption of spatially uniform distribution and let the data themselves reveal the spatial distribution of the common mode error. The principal component analysis (PCA) and the Karhunen-Loeve expansion (KLE) both decompose network time series into a set of temporally varying modes and their spatial responses. Therefore they provide a mathematical framework to perform spatiotemporal filtering. We apply the combination of PCA and KLE to daily station coordinate time series of the Southern California Integrated GPS Network (SCIGN) for the period 2000 to 2004. We demonstrate that spatially and temporally correlated common mode errors are the dominant error source in daily GPS solutions. The spatial characteristics of the common mode errors are close to uniform for all east, north, and vertical components, which implies a very long wavelength source for the common mode errors, compared to the spatial extent of the GPS network in southern California. Furthermore, the common mode errors exhibit temporally nonrandom patterns.
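The PCA filtering step can be sketched in pure Python (power iteration stands in for a full eigendecomposition; this is not the authors' implementation): estimate the leading spatial eigenvector and its temporal mode from the stacked station time series, then subtract each station's share of that common mode.

```python
import math

def pca_filter(series):
    """series: list of per-station time series of equal length.
    Removes the first principal component (the common mode) and
    returns the filtered, demeaned series."""
    n_sta, n_t = len(series), len(series[0])
    means = [sum(s) / n_t for s in series]
    x = [[v - m for v in s] for s, m in zip(series, means)]
    # station-by-station covariance matrix
    cov = [[sum(x[i][t] * x[j][t] for t in range(n_t)) / n_t
            for j in range(n_sta)] for i in range(n_sta)]
    # power iteration for the leading eigenvector (spatial response)
    v = [1.0] * n_sta
    for _ in range(100):
        w = [sum(cov[i][j] * v[j] for j in range(n_sta))
             for i in range(n_sta)]
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    # temporal mode: projection of the data onto the eigenvector
    mode = [sum(v[i] * x[i][t] for i in range(n_sta))
            for t in range(n_t)]
    return [[x[i][t] - v[i] * mode[t] for t in range(n_t)]
            for i in range(n_sta)]

# If every station sees a scaled copy of one common signal, the
# filtered residuals vanish:
common = [math.sin(0.3 * t) for t in range(50)]
series = [[a * v for v in common] for a in (1.0, 0.8, 1.2)]
filtered = pca_filter(series)
assert max(abs(val) for row in filtered for val in row) < 1e-6
```

Allowing the eigenvector entries to differ per station is precisely what drops the spatially-uniform assumption criticized above.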

2001 - Inf. Syst.

The OO-method approach for information systems modeling: from object-oriented conceptual modeling to automated programming

Current and future (conventional) notations used in Conceptual Modeling Techniques should have a precise (formal) semantics to provide a well-defined software development process, in order to go from specification to implementation in an automated way. To achieve this objective, the OO-method approach to Information Systems Modeling presented in this paper attempts to overcome the conventional (informal)/formal dichotomy by selecting the best ideas from both approaches. The OO-method makes a clear distinction between the problem space (centered on what the system is) and the solution space (centered on how it is implemented as a software product). It provides a precise, conventional graphical notation to obtain a system description at the problem-space level; however, this notation is strictly based on a formal OO specification language that determines the conceptual modeling constructs needed to obtain the system specification. An abstract execution model determines how to obtain the software representations corresponding to these conceptual modeling constructs. In this way, the final software product can be obtained in an automated way.

2015 - Cerebral Cortex (New York, NY)

Functional Organization of Social Perception and Cognition in the Superior Temporal Sulcus

The superior temporal sulcus (STS) is considered a hub for social perception and cognition, including the perception of faces and human motion, as well as understanding others' actions, mental states, and language. However, the functional organization of the STS remains debated: Is this broad region composed of multiple functionally distinct modules, each specialized for a different process, or are STS subregions multifunctional, contributing to multiple processes? Is the STS spatially organized, and if so, what are the dominant features of this organization? We address these questions by measuring STS responses to a range of social and linguistic stimuli in the same set of human participants, using fMRI. We find a number of STS subregions that respond selectively to certain types of social input, organized along a posterior-to-anterior axis. We also identify regions of overlapping response to multiple contrasts, including regions responsive to both language and theory of mind, faces and voices, and faces and biological motion. Thus, the human STS contains both relatively domain-specific areas, and regions that respond to multiple types of social information.

1998 - IEEE Trans. Inf. Theory

Nonparametric Entropy Estimation for Stationary Processes and Random Fields, with Applications to English Text

We discuss a family of estimators for the entropy rate of a stationary ergodic process and prove their pointwise and mean consistency under a Doeblin-type mixing condition. The estimators are Cesàro averages of longest match-lengths, and their consistency follows from a generalized ergodic theorem due to Maker (1940). We provide examples of their performance on English text, and we generalize our results to countable alphabet processes and to random fields.
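The match-length idea can be sketched on binary strings (an illustrative implementation, not the paper's exact windowed variant): L_i is one plus the length of the longest prefix of the sequence starting at position i that also occurs starting somewhere before i, and the entropy rate estimate is the reciprocal of the Cesàro average of L_i / log2(i).

```python
import math
import random

def match_length(s, i):
    """1 + length of the longest prefix of s[i:] that also occurs
    starting at some position j < i (overlap with s[i:] allowed)."""
    best = 0
    for j in range(i):
        l = 0
        while i + l < len(s) and s[j + l] == s[i + l]:
            l += 1
        best = max(best, l)
    return best + 1

def entropy_rate(s):
    """Match-length entropy estimate in bits per symbol: the
    reciprocal of the Cesaro average of L_i / log2(i)."""
    n = len(s)
    total = sum(match_length(s, i) / math.log2(i) for i in range(2, n))
    return (n - 2) / total

rng = random.Random(0)
fair_bits = "".join(rng.choice("01") for _ in range(800))
# Fair coin flips have entropy rate 1 bit/symbol; at this length the
# estimate is biased low, but it clearly separates the two regimes:
assert 0.4 < entropy_rate(fair_bits) < 1.3
assert entropy_rate("0" * 200) < 0.2
```

Long matches mean a predictable source, so L_i grows much faster than log2(i) and the estimate drops, which is why the constant string scores near zero.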

2001 - Water Resources Research

An analysis of terrestrial water storage variations in Illinois with implications for the Gravity Recovery and Climate Experiment (GRACE)

Variations in terrestrial water storage affect weather, climate, geophysical phenomena, and life on land, yet observation and understanding of terrestrial water storage are deficient. However, estimates of terrestrial water storage changes soon may be derived from observations of Earth's time-dependent gravity field made by NASA's Gravity Recovery and Climate Experiment (GRACE). Previous studies have evaluated that concept using modeled soil moisture and snow data. This investigation builds upon their results by relying on observations rather than modeled results, by analyzing groundwater and surface water variations as well as snow and soil water variations, and by using a longer time series. Expected uncertainty in GRACE-derived water storage changes is compared to monthly, seasonal, and annual terrestrial water storage changes estimated from observations in Illinois (145,800 km2). Assuming those changes are representative of larger regions, detection is possible for areas of 200,000 km2 or larger. Changes in soil moisture are typically the largest component of terrestrial water storage variations, followed by changes in groundwater plus intermediate zone storage.

Paper Keywords

neural network sensor network wireless sensor network wireless sensor deep learning comparative study base station information retrieval feature extraction sensor node programming language cellular network random field digital video number theory rate control network lifetime river basin hyperspectral imaging distributed algorithm chemical reaction carnegie mellon university fly ash visual feature boundary detection video retrieval diabetes mellitu semantic indexing oryza sativa water storage user association efficient wireles shot boundary shot boundary detection data assimilation system retrieval task controlled trial terrestrial television video search gps network sensor network consist efficient wireless sensor information retrieval task concept detection video captioning retrieval evaluation rice seed safety equipment endangered species station operation case study involving dublin city university high-level feature seed germination brown coal high plain study involving structure recognition climate experiment gravity recovery table structure land data assimilation instance search combinatorial number randomised controlled trial recovery and climate randomised controlled combinatorial number theory adult male high-level feature extraction complete proof music perception robust computation optimization-based method perception and cognition global land datum social perception terrestrial water storage trec video retrieval terrestrial water object-oriented conceptual video retrieval evaluation trec video seed variety base station operation table structure recognition transgenic rice concept detector total water storage groundwater storage regional gp grace gravity randomized distributed algorithm ibm tivoli workload scheduler cerebrovascular accident case study united state