Today's Recommendations

2003 - CAV

Interpolation and SAT-Based Model Checking

We consider a fully SAT-based method of unbounded symbolic model checking based on computing Craig interpolants. In benchmark studies using a set of large industrial circuit verification instances, this method is greatly more efficient than BDD-based symbolic model checking, and compares favorably to some recent SAT-based model checking methods on positive instances.
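
The core of the method is a fixpoint loop: a SAT solver refutes bounded counterexamples, and the Craig interpolant extracted from each refutation serves as an over-approximate image of the reachable states. As a rough illustration of that loop only, here is a minimal runnable sketch on an explicit toy transition system, with the exact post-image standing in for the interpolant; nothing below is from the paper's implementation.

```python
def reachability_fixpoint(init, trans, bad):
    """init: set of initial states; trans: dict mapping state -> successor
    set; bad: set of error states. Returns 'safe' or 'unsafe'."""
    reached = set(init)
    frontier = set(init)
    while frontier:
        if reached & bad:
            return 'unsafe'
        # Post-image of the frontier; in the real method a Craig
        # interpolant over-approximates this set instead.
        image = set().union(*(trans.get(s, set()) for s in frontier))
        frontier = image - reached       # keep only genuinely new states
        reached |= frontier              # fixpoint once nothing is new
    return 'unsafe' if reached & bad else 'safe'

# Toy system 0 -> 1 -> 2 with error state 3 unreachable.
print(reachability_fixpoint({0}, {0: {1}, 1: {2}, 2: {2}}, {3}))  # safe
```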

2005

Towards a Theory of Mobile Learning

There is a need to re-conceptualise learning for the mobile age, to recognise the essential role of mobility and communication in the process of learning, and also to indicate the importance of context in establishing meaning, and the transformative effect of digital networks in supporting virtual communities that transcend barriers of age and culture. In this paper we offer a framework for theorising about mobile learning, to complement theories of infant, classroom, workplace and informal learning. A related aim is to inform the design of new environments and technologies to support mobile learning, since the work described here has been developed through a series of projects to design mobile learning technology. In the tradition of Activity Theory we analyse learning as a cultural-historical activity system, mediated by tools that both constrain and support the learners in their goals of transforming their knowledge and skills. We separate two perspectives, or layers, of tool-mediated activity. The semiotic layer describes learning as a semiotic system in which the learner’s object-oriented actions are mediated by cultural tools and signs. The technological layer represents learning as an engagement with technology, in which tools such as computers and mobile phones function as interactive agents in the process of coming to know. These layers can be prised apart, to provide either a semiotic framework to promote discussion with educational theorists to analyse learning in the mobile age, or a technological framework for software developers and engineers to propose requirements for the design and evaluation of new mobile learning systems. Or the layers can be superimposed to examine the dynamics and co-evolution of learning and technology.

2008

Consequences of More Extreme Precipitation Regimes for Terrestrial Ecosystems

Amplification of the hydrological cycle as a consequence of global warming is forecast to lead to more extreme intra-annual precipitation regimes characterized by larger rainfall events and longer intervals between events. We present a conceptual framework, based on past investigations and ecological theory, for predicting the consequences of this underappreciated aspect of climate change. We consider a broad range of terrestrial ecosystems that vary in their overall water balance. More extreme rainfall regimes are expected to increase the duration and severity of soil water stress in mesic ecosystems as intervals between rainfall events increase. In contrast, xeric ecosystems may exhibit the opposite response to extreme events. Larger but less frequent rainfall events may result in proportional reductions in evaporative losses in xeric systems, and thus may lead to greater soil water availability. Hydric (wetland) ecosystems are predicted to experience reduced periods of anoxia in response to prolonged intervals between rainfall events. Understanding these contingent effects of ecosystem water balance is necessary for predicting how more extreme precipitation regimes will modify ecosystem processes and alter interactions with related global change drivers.

2010 - PLoS ONE

Finding Statistically Significant Communities in Networks

Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method performs comparably to the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks.
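
To give a flavour of significance-based local optimization, here is a toy sketch: a cluster is grown by admitting the neighbour whose in-cluster degree is least likely under a random null model. The binomial tail test, the stub-probability estimate and the threshold below are simplified stand-ins for OSLOM's actual order-statistics machinery; all names and numbers are illustrative.

```python
from math import comb

def binom_tail(k_in, k, p):
    """P(X >= k_in) for X ~ Binomial(k, p)."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(k_in, k + 1))

def grow_cluster(adj, seed, alpha=0.3):      # alpha: illustrative threshold
    m2 = sum(len(v) for v in adj.values())   # 2 * number of edges
    cluster = {seed}
    while True:
        cand = {n for v in cluster for n in adj[v]} - cluster
        best, best_p = None, 1.0
        for n in cand:
            k, k_in = len(adj[n]), len(adj[n] & cluster)
            # chance that a random edge endpoint lands inside the cluster
            p_in = sum(len(adj[v]) for v in cluster) / m2
            p = binom_tail(k_in, k, p_in)
            if p < best_p:
                best, best_p = n, p
        if best is None or best_p > alpha:   # no significant candidate left
            return cluster
        cluster.add(best)

# Two triangles joined by one edge; growth from node 0 stays in its triangle.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
print(grow_cluster(adj, 0))   # {0, 1, 2}
```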

1997 - POPL '97

Model checking for programming languages using VeriSoft

Verification by state-space exploration, also often referred to as "model checking", is an effective method for analyzing the correctness of concurrent reactive systems (e.g., communication protocols). Unfortunately, existing model-checking techniques are restricted to the verification of properties of models, i.e., abstractions, of concurrent systems. In this paper, we discuss how model checking can be extended to deal directly with "actual" descriptions of concurrent systems, e.g., implementations of communication protocols written in programming languages such as C or C++. We then introduce a new search technique that is suitable for exploring the state spaces of such systems. This algorithm has been implemented in VeriSoft, a tool for systematically exploring the state spaces of systems composed of several concurrent processes executing arbitrary C code. As an example of application, we describe how VeriSoft successfully discovered an error in a 2500-line C program controlling robots operating in an unpredictable environment.
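
A stateless search of this kind can be shown in miniature: re-execute a small concurrent program under every interleaving of its visible operations and check an assertion after each run. The sketch below is a toy Python analogue, not VeriSoft's C-level scheduler; the classic lost-update bug it finds is the kind of error such exploration surfaces.

```python
def run(schedule, procs):
    """Execute the steps of procs in the order given by schedule (pids)."""
    shared, regs = 0, {}
    pcs = {p: 0 for p in procs}
    for pid in schedule:
        op = procs[pid][pcs[pid]]
        if op == 'load':
            regs[pid] = shared          # read shared variable into register
        else:
            shared = regs[pid] + 1      # 'store': write back incremented value
        pcs[pid] += 1
    return shared

def schedules(counts, prefix=()):
    """Enumerate every interleaving of the per-process step counts."""
    if all(c == 0 for c in counts.values()):
        yield prefix
    for pid, c in counts.items():
        if c:
            yield from schedules({**counts, pid: c - 1}, prefix + (pid,))

procs = {'A': ['load', 'store'], 'B': ['load', 'store']}
for sched in schedules({p: len(s) for p, s in procs.items()}):
    if run(sched, procs) != 2:          # lost update: both load before a store
        print('violation under schedule', sched)
```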

1989 - J. Documentation

A Behavioural Approach to Information Retrieval System Design

A behavioural approach to information retrieval system design is outlined based on the derivation of a behavioural model of the information seeking patterns of academic social scientists. The information seeking patterns of a variety of academic social scientists were broken down into six characteristics: starting, chaining, browsing, differentiating, monitoring, and extracting. These characteristics constitute the principal generic features of the different individual patterns, and together provide a flexible behavioural model for information retrieval system design. The extent to which these characteristics are available on existing systems is considered, and the requirements for implementing the features on an experimental system are set out.

2015 - arXiv: Artificial Intelligence

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks

One long-term goal of machine learning research is to produce methods that are applicable to reasoning and natural language, in particular building an intelligent dialogue agent. To measure progress towards that goal, we argue for the usefulness of a set of proxy tasks that evaluate reading comprehension via question answering. Our tasks measure understanding in several ways: whether a system is able to answer questions via chaining facts, simple induction, deduction and many more. The tasks are designed to be prerequisites for any system that aims to be capable of conversing with a human. We believe many existing learning systems cannot currently solve them; hence our aim is to classify these tasks into skill sets, so that researchers can identify (and then rectify) the failings of their systems. We also extend and improve the recently introduced Memory Networks model, and show it is able to solve some, but not all, of the tasks.
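
For concreteness, a task in the "two supporting facts" style looks like the sketch below; the scripted lookup that answers it is a deliberately trivial baseline (not the Memory Networks model), showing what "chaining facts" means here. The story and code are illustrative, not drawn from the released dataset.

```python
# Answering requires chaining who holds the object with that actor's
# latest location.
story = [
    "Mary picked up the football.",
    "Mary went to the kitchen.",
    "Mary went to the garden.",
]
question = "Where is the football?"

# Fact 1: who holds the object.  Fact 2: the holder's most recent location.
holder = next(s.split()[0] for s in story if "football" in s)
location = [s.split()[-1].rstrip(".") for s in story
            if s.startswith(holder) and "went" in s][-1]
print(location)   # garden
```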

2006

Social Intelligence: The New Science of Human Relationships

Author Daniel Goleman explores the manner in which the brain is designed to engage in brain-to-brain “hookups” with others, and how these interactions affect both our social interactions and physical/mental well-being. Based upon conceptualizations pioneered by Edward Thorndike, Goleman analyzes a traditional concept of social intelligence for the purpose of developing a revised model that consists of two categories: social awareness (e.g., assessing the feelings of others) and social facility (e.g., awareness of how people present themselves). Goleman also explores advances in neuroscience that have made it possible for scientists and psychologists to study the ways in which emotions and biology work together.

1988 - Journal of Environmental Economics and Management

A New Paradigm for Valuing Non-market Goods Using Referendum Data: Maximum Likelihood Estimation by Censored Logistic Regression

This paper challenges the use by W. M. Hanemann [Amer. J. Agr. Econom. 66, 332–341 (1984)] and by C. Sellar, J. P. Chavas, and J. R. Stoll [J. Environ. Econom. Management 13, 382–390 (1986)] of logit models to estimate the value of non-market resources from “referendum” survey data. These data are more informative than conventional choice data. The “random utility” interpretation of logit models is therefore too restrictive. Bypassing the utility function entirely, it will be shown that parameters and standard errors for utility-theoretic inverse Hicksian demand functions can be extracted directly and much more simply. Estimated demand functions need not be limited to those corresponding to the linear-in-parameters utility-difference specifications that can be handled by packaged logit programs.
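
For context, the referendum-logit setup under discussion fits P(yes | bid t) = 1/(1 + exp(-(a - b*t))) by maximum likelihood, from which the median willingness-to-pay is a/b. A minimal sketch with synthetic data, assuming numpy and scipy; this illustrates the standard Hanemann-style model the paper critiques, not the paper's own estimator.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
bids = rng.choice([5.0, 10.0, 20.0, 40.0], size=500)   # one bid per respondent
true_a, true_b = 3.0, 0.15
yes = rng.random(500) < 1 / (1 + np.exp(-(true_a - true_b * bids)))

def neg_loglik(params):
    a, b = params
    eta = a - b * bids
    # log P(yes) = eta - log(1+e^eta); log P(no) = -log(1+e^eta)
    return -(yes * eta - np.log1p(np.exp(eta))).sum()

fit = minimize(neg_loglik, x0=[1.0, 0.1])
a_hat, b_hat = fit.x
print("median WTP estimate:", a_hat / b_hat)   # close to 3.0/0.15 = 20
```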

2000 - IEEE transactions on image processing : a publication of the IEEE Signal Processing Society

The Bayesian image retrieval system, PicHunter: theory, implementation, and psychophysical experiments

This paper presents the theory, design principles, implementation and performance results of PicHunter, a prototype content-based image retrieval (CBIR) system. In addition, this document presents the rationale, design and results of psychophysical experiments that were conducted to address some key issues that arose during PicHunter's development. The PicHunter project makes four primary contributions to research on CBIR. First, PicHunter represents a simple instance of a general Bayesian framework which we describe for using relevance feedback to direct a search. With an explicit model of what users would do, given the target image they want, PicHunter uses Bayes's rule to predict the target they want, given their actions. This is done via a probability distribution over possible image targets, rather than by refining a query. Second, an entropy-minimizing display algorithm is described that attempts to maximize the information obtained from a user at each iteration of the search. Third, PicHunter makes use of hidden annotation rather than a possibly inaccurate/inconsistent annotation structure that the user must learn and make queries in. Finally, PicHunter introduces two experimental paradigms to quantitatively evaluate the performance of the system, and psychophysical experiments are presented that support the theoretical claims.
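
The Bayesian update at PicHunter's core is compact: the posterior over candidate targets is the prior rescaled by the likelihood of the user's action under an explicit user model. A minimal sketch, assuming a softmax user model over pairwise similarities (an illustrative choice, not PicHunter's actual feature-based model):

```python
import numpy as np

def update_posterior(prior, displayed, selected, sim):
    """prior: P(target) over all images; displayed: indices shown;
    selected: index the user clicked; sim[i, j]: similarity of images i, j.
    User model: the user softly prefers the displayed image most similar
    to their target (softmax over similarities)."""
    posterior = prior.copy()
    for t in range(len(prior)):
        # log P(select | target = t) under the softmax user model
        log_lik = sim[selected, t] - np.logaddexp.reduce(
            [sim[d, t] for d in displayed])
        posterior[t] *= np.exp(log_lik)
    return posterior / posterior.sum()

sim = np.array([[1.0, 0.2, 0.1], [0.2, 1.0, 0.3], [0.1, 0.3, 1.0]])
prior = np.ones(3) / 3
post = update_posterior(prior, displayed=[0, 1], selected=1, sim=sim)
print(post)   # mass shifts toward images similar to the selected image
```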

2003 - RFC

TCP Friendly Rate Control (TFRC): Protocol Specification

This document specifies TCP-Friendly Rate Control (TFRC). TFRC is a congestion control mechanism for unicast flows operating in a best-effort Internet environment. It is reasonably fair when competing for bandwidth with TCP flows, but has a much lower variation of throughput over time compared with TCP, making it more suitable for applications such as telephony or streaming media where a relatively smooth sending rate is of importance.
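
The mechanism at TFRC's core is the TCP throughput equation of RFC 3448, Section 3.1: the sender caps its rate at roughly what a conformant TCP flow would achieve under the same loss event rate and round-trip time. A sketch of that equation in Python, with t_RTO approximated as 4R as the RFC suggests:

```python
from math import sqrt

def tfrc_rate(s, R, p, b=1):
    """Allowed sending rate in bytes/second.
    s: packet size (bytes); R: round-trip time (s);
    p: loss event rate; b: packets acknowledged per ACK."""
    t_RTO = 4 * R   # retransmission timeout, per the RFC's simplification
    return s / (R * sqrt(2 * b * p / 3)
                + t_RTO * (3 * sqrt(3 * b * p / 8)) * p * (1 + 32 * p * p))

# 1460-byte packets, 100 ms RTT, 1% loss event rate
print(tfrc_rate(1460, 0.1, 0.01))   # roughly 160 kB/s
```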

2003 - New Directions in Question Answering

TimeML: Robust Specification of Event and Temporal Expressions in Text

In this paper we provide a description of TimeML, a rich specification language for event and temporal expressions in natural language text, developed in the context of the AQUAINT program on Question Answering Systems. Unlike most previous work on event annotation, TimeML captures three distinct phenomena in temporal markup: (1) it systematically anchors event predicates to a broad range of temporally denotating expressions; (2) it orders event expressions in text relative to one another, both intrasententially and in discourse; and (3) it allows for a delayed (underspecified) interpretation of partially determined temporal expressions. We demonstrate the expressiveness of TimeML for a broad range of syntactic and semantic contexts, including aspectual predication, modal subordination, and an initial treatment of lexical and constructional causation in text.
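
As a flavour of the markup, a single annotated clause might look like the following; the tag inventory (EVENT, TIMEX3, TLINK) follows TimeML, but the specific attribute values here are simplified for illustration.

```python
# Illustrative TimeML-style markup held as a string: an event anchored to a
# time expression by a TLINK; attribute values are simplified examples.
annotated = (
    'John <EVENT eid="e1" class="OCCURRENCE">left</EVENT> the office '
    '<TIMEX3 tid="t1" type="TIME" value="2003-05-01T17:00">'
    'at 5 p.m. on May 1, 2003</TIMEX3>. '
    '<TLINK eventID="e1" relatedToTime="t1" relType="IS_INCLUDED"/>'
)
print(annotated)
```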

1992 - Econometrica

Trimmed LAD and Least Squares Estimation of Truncated and Censored Regression Models with Fixed Effects

This paper considers estimation of truncated and censored regression models with fixed effects. Up until now, no estimator has been shown to be consistent as the cross-section dimension increases with the time dimension fixed. Trimmed least absolute deviations and trimmed least squares estimators are proposed for the case where the panel is of length two, and it is proven that they are consistent and asymptotically normal. It is not necessary to maintain parametric assumptions on the error terms to obtain this result. A small-scale Monte Carlo study demonstrates that these estimators can perform well in small samples.

1994 - Inf. Comput.

Symbolic Model Checking for Real-Time Systems

We describe finite-state programs over real-numbered time in a guarded-command language with real-valued clocks or, equivalently, as finite automata with real-valued clocks. Model checking answers the question of which states of a real-time program satisfy a branching-time specification (given in an extension of CTL with clock variables). We develop an algorithm that computes this set of states symbolically as a fixpoint of a functional on state predicates, without constructing the state space. For this purpose, we introduce a μ-calculus on computation trees over real-numbered time. Unfortunately, many standard program properties, such as response for all nonzeno execution sequences (during which time diverges), cannot be characterized by fixpoints: we show that the expressiveness of the timed μ-calculus is incomparable to the expressiveness of timed CTL. Fortunately, this result does not impair the symbolic verification of "implementable" real-time programs: those whose safety constraints are machine-closed with respect to diverging time and whose fairness constraints are restricted to finite upper bounds on clock values. All timed CTL properties of such programs are shown to be computable as finitely approximable fixpoints in a simple decidable theory.

1975 - Journal of applied physiology

Respiratory sinus arrhythmia: noninvasive measure of parasympathetic cardiac control.

The degree of parasympathetic heart rate control, PC, was defined as the decrease in average heart period (RR interval) caused by the elimination of parasympathetically mediated influences on the heart while keeping sympathetic activity unchanged. By reviewing published results on the interaction of sympathetic and parasympathetic heart rate control, the prediction was made that PC should be directly proportional to VHP, the peak-to-peak variations in heart period caused by spontaneous respiration. In seven chloralose/urethan-anesthetized dogs the vagi were reversibly blocked by cooling, and PC (the difference between average heart period before and after cooling) and VHP (without cooling) were determined under a variety of conditions that included a) increasing vagal activity by elevating the blood pressure, b) sympathetic blockade, and c) parasympathetic blockade. The relationship between VHP and PC was linear with an average correlation coefficient of 0.969 +/- 0.024 (SD) and a PC-axis intercept of 15.2 +/- 25.9 ms. In each dog the correlation coefficient between VHP and PC was higher than between VHP and the average heart period (avg correlation coef: 0.914 +/- 0.044). These results suggest that the degree of respiratory sinus arrhythmia may be used as a noninvasive indicator of the degree of parasympathetic cardiac control.
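
The index itself is easy to compute: VHP is the peak-to-peak variation in heart period across a respiratory cycle. A minimal sketch, assuming RR intervals (in ms) already segmented by breath; the numbers are synthetic.

```python
def vhp(rr_by_breath):
    """Mean peak-to-peak RR-interval variation (ms) over respiratory cycles."""
    spans = [max(cycle) - min(cycle) for cycle in rr_by_breath]
    return sum(spans) / len(spans)

# Two synthetic breaths of beat-to-beat RR intervals (ms).
breaths = [[820, 870, 910, 860], [815, 880, 905, 855]]
print(vhp(breaths))   # 90.0 ms; per the paper, proportional to PC
```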

1978

Introduction to the Theory of Thermal Neutron Scattering

1. Introduction
2. Nuclear scattering - basic theory
3. Nuclear scattering by crystals
4. Correlation functions in nuclear scattering
5. Scattering by liquids
6. Neutron optics
7. Magnetic scattering - basic theory
8. Scattering from magnetically ordered crystals
9. Polarisation analysis
Appendices. Solutions to examples. Index.

2006 - TACAS

PRISM: A Tool for Automatic Verification of Probabilistic Systems

Probabilistic model checking is an automatic formal verification technique for analysing quantitative properties of systems which exhibit stochastic behaviour. PRISM is a probabilistic model checking tool which has already been successfully deployed in a wide range of application domains, from real-time communication protocols to biological signalling pathways. The tool has recently undergone a significant amount of development. Major additions include facilities to manually explore models, Monte-Carlo discrete-event simulation techniques for approximate model analysis (including support for distributed simulation) and the ability to compute cost- and reward-based measures, e.g. “the expected energy consumption of the system before the first failure occurs”. This paper presents an overview of all the main features of PRISM. More information can be found on the website: www.cs.bham.ac.uk/~dxp/prism.
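
To illustrate the kind of quantity PRISM computes, consider a query such as P=? [ F "goal" ] on a discrete-time Markov chain: the probability of eventually reaching a goal state, obtainable by value iteration. A minimal hand-rolled sketch (not PRISM's engine), on an illustrative three-state chain:

```python
def reach_prob(P, goal, iters=1000):
    """P[s][t]: transition probability from s to t; goal: set of states.
    Returns, per state, the probability of eventually reaching goal."""
    n = len(P)
    x = [1.0 if s in goal else 0.0 for s in range(n)]
    for _ in range(iters):   # iterate x = goal? 1 : sum_t P[s][t] * x[t]
        x = [1.0 if s in goal else sum(P[s][t] * x[t] for t in range(n))
             for s in range(n)]
    return x

# From state 0: stay (0.5), reach the goal state 1 (0.3), or sink to 2 (0.2).
P = [[0.5, 0.3, 0.2], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(reach_prob(P, goal={1})[0])   # converges to 0.6
```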

2014 - Cognitive Computation

An Insight into Extreme Learning Machines: Random Neurons, Random Features and Kernels

Extreme learning machines (ELMs) basically give answers to two fundamental learning problems: (1) Can the fundamentals of learning (i.e., feature learning, clustering, regression and classification) be achieved without tuning hidden neurons (including biological neurons), even when the output shapes and function modeling of these neurons are unknown? (2) Does there exist a unified framework for feedforward neural networks and feature space methods? ELMs, which have built some tangible links between machine learning techniques and biological learning mechanisms, have recently attracted increasing attention from researchers in widespread research areas. This paper provides an insight into ELMs in three aspects, viz., random neurons, random features and kernels. This paper also shows that in theory ELMs (with the same kernels) tend to outperform support vector machines and their variants in both regression and classification applications, with much easier implementation.
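
The basic ELM recipe behind these results is short: draw the hidden-layer weights at random, never tune them, and solve only the linear output weights by least squares. A minimal regression sketch with numpy; the architecture and data are illustrative.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never tuned
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # the only learned part
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
print(np.abs(elm_predict(X, W, b, beta) - y).max())  # small fit error
```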

2001 - Formal Methods in System Design

Bounded Model Checking Using Satisfiability Solving

The phrase model checking refers to algorithms for exploring the state space of a transition system to determine if it obeys a specification of its intended behavior. These algorithms can perform exhaustive verification in a highly automatic manner, and, thus, have attracted much interest in industry. Model checking programs are now being commercially marketed. However, model checking has been held back by the state explosion problem, which is the problem that the number of states in a system grows exponentially in the number of system components. Much research has been devoted to ameliorating this problem. In this tutorial, we first give a brief overview of the history of model checking to date, and then focus on recent techniques that combine model checking with satisfiability solving. These techniques, known as bounded model checking, do a very fast exploration of the state space, and for some types of problems seem to offer large performance improvements over previous approaches. We review experiments with bounded model checking on both public domain and industrial designs, and propose a methodology for applying the technique in industry for invariance checking. We then summarize the pros and cons of this new technology and discuss future research efforts to extend its capabilities.
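
The construction itself is a single formula: unroll the transition relation k steps and ask a SAT solver whether I(s0) and T(s0,s1) and ... and "P violated within k steps" is satisfiable. A minimal sketch, assuming the z3 solver's Python bindings (pip install z3-solver); the 2-bit counter and the property "never reach 3" are illustrative.

```python
from z3 import Bools, Solver, And, Or, Not, Xor, sat

def step(lo, hi, lo2, hi2):
    """Transition relation of a 2-bit counter: (hi lo) := (hi lo) + 1."""
    return And(lo2 == Not(lo), hi2 == Xor(hi, lo))

def bmc(k):
    s = Solver()
    bits = [Bools(f'lo{i} hi{i}') for i in range(k + 1)]
    lo0, hi0 = bits[0]
    s.add(Not(lo0), Not(hi0))                    # I: counter starts at 0
    for i in range(k):
        s.add(step(*bits[i], *bits[i + 1]))      # T, unrolled k times
    # !P: the value 3 (both bits set) occurs somewhere within k steps
    s.add(Or([And(lo, hi) for lo, hi in bits]))
    return s.check() == sat

for k in range(5):
    print(k, bmc(k))   # False for k < 3; a counterexample exists at k = 3
```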

2014

Future changes to the intensity and frequency of short‐duration extreme rainfall

Evidence that extreme rainfall intensity is increasing at the global scale has strengthened considerably in recent years. Research now indicates that the greatest increases are likely to occur in short‐duration storms lasting less than a day, potentially leading to an increase in the magnitude and frequency of flash floods. This review examines the evidence for subdaily extreme rainfall intensification due to anthropogenic climate change and describes our current physical understanding of the association between subdaily extreme rainfall intensity and atmospheric temperature. We also examine the nature, quality, and quantity of information needed to allow society to adapt successfully to predicted future changes, and discuss the roles of observational and modeling studies in helping us to better understand the physical processes that can influence subdaily extreme rainfall characteristics. We conclude by describing the types of research required to produce a more thorough understanding of the relationships between local‐scale thermodynamic effects, large‐scale atmospheric circulation, and subdaily extreme rainfall intensity.

Paper Keywords

time series, software development, information retrieval, regression model, image retrieval, maximum likelihood, knowledge base, retrieval system, model checking, distance learning, real-time system, question answering, extreme learning machine, learning machine, information retrieval system, extreme learning, order statistic, content-based image retrieval, temporal logic, rate control, formal method, statistical inference, weibull distribution, nuclear reactor, visual attention, image retrieval system, question answering system, carnegie mellon university, binary decision diagram, java virtual machine, answering system, atrial fibrillation, carnegie mellon, memory network, random sequence, mellon university, extreme programming, southeast asia, research issue, model checker, extreme event, belief revision, visual question answering, bounded model checking, symbolic model, visual question, abstract model, extreme value theory, bounded model, symbolic model checking, automated storage, statistically significant, bibliography index, arithmetic logic unit, model checking technique, extreme value distribution, model checking algorithm, extreme weather, south pacific, interactive information retrieval, sample variance, multivariate extreme, open-domain question answering, model checking based, state of knowledge, extreme temperature, answering question, question answering dataset, extreme rainfall, open-domain question, question answering track, extreme precipitation, daily temperature, logic model checking, answering track, symbolic model checker, desired property, counterexample-guided abstraction refinement, sat-based model checking, temperature extreme, extreme precipitation event, climate extreme, formal methods community, extreme storm, climate event, sat-based model, precipitation extreme, french polynesia, image question answering, lazy abstraction, severe thunderstorm, modeling of extreme, silo (dataset), pipeline (computing), word list by frequency, reactor device component, reactor (software), united state