This paper presents ObjRecombGA, a genetic algorithm framework for recombining related programs at the object file level. A genetic algorithm guides the selection of object files, while a robust link resolver allows working program binaries to be produced from the object files derived from two ancestor programs. Tests on compiled C programs, including a simple web browser and a well-known 3D video game, show that functional program variants can be created that exhibit key features of both ancestor programs. This work illustrates the feasibility of applying evolutionary techniques directly to commodity applications.
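To make the recombination idea concrete, the following is a minimal sketch of a genetic algorithm over object-file choices, assuming both ancestors compile to object files with matching names and that a hypothetical link_and_test() helper links a candidate set and scores it against a test suite; it is an illustration, not the paper's ObjRecombGA implementation.

import random

def link_and_test(chosen_objects):
    """Hypothetical helper: link the chosen .o files (e.g. with gcc/ld), run a
    test suite on the resulting binary, and return a fitness in [0, 1]
    (0 if linking fails)."""
    raise NotImplementedError

def recombine(obj_names, generations=50, pop_size=30, mutation_rate=0.05):
    # A genome holds one bit per object file: 0 -> take ancestor A's copy,
    # 1 -> take ancestor B's copy.
    def decode(genome):
        return [f"{'A' if bit == 0 else 'B'}/{name}"
                for bit, name in zip(genome, obj_names)]

    def fitness(genome):
        return link_and_test(decode(genome))

    pop = [[random.randint(0, 1) for _ in obj_names] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]                     # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(obj_names))         # one-point crossover
            child = [bit ^ (random.random() < mutation_rate)  # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return decode(max(pop, key=fitness))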
We describe a novel distributed storage protocol for Disruption (Delay) Tolerant Networks (DTNs). Since DTNs cannot guarantee network connectivity at all times, distributed data storage and lookup have to be performed in a store-and-forward way. In this work, we define local distributed location regions, called cells, to facilitate the data storage and lookup process. Nodes in a cell have a high probability of moving within their cells. Our protocol stores data items in cells, which have a hierarchical structure, to reduce the routing information stored at nodes. Multiple copies of a data item may be stored at different nodes to counter the intermittent connectivity of DTNs. Because the cells are relatively stable regions, data exchange overheads among nodes are reduced. Through experimentation, we show that the proposed distributed storage protocol achieves higher successful data storage ratios, lower delays, and fewer data item exchanges than other protocols in the literature.
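The storage and lookup flow can be pictured with a small sketch, assuming a flat set of cells (the paper's cells are hierarchical), a hash-based mapping from keys to a home cell, and a fixed replication factor; the names and parameters here are illustrative, not the protocol's.

import hashlib

REPLICAS = 3  # illustrative number of copies kept to tolerate intermittent links

def home_cell(key, num_cells):
    """Map a data key to the cell responsible for storing it."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_cells

def store(key, value, nodes_by_cell, num_cells):
    """Place up to REPLICAS copies on nodes currently inside the home cell.
    In a DTN this happens store-and-forward as carrier nodes reach the cell."""
    cell = home_cell(key, num_cells)
    carriers = nodes_by_cell.get(cell, [])[:REPLICAS]
    for node in carriers:
        node.storage[key] = value
    return cell, carriers

def lookup(key, nodes_by_cell, num_cells):
    """Route the query toward the home cell and ask any node holding a copy."""
    cell = home_cell(key, num_cells)
    for node in nodes_by_cell.get(cell, []):
        if key in node.storage:
            return node.storage[key]
    return None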
There is a paradoxical relationship between the density of solar housing and net household energy use. The amount of solar energy available per person decreases as density increases. At the same time, transportation energy and, to some extent, household operating energy decrease. Thus, an interesting question is posed: how does net energy use vary with housing density? This study attempts to provide insight into this question by examining three housing forms in Toronto: low-density detached homes, medium-density townhouses, and high-density high-rise apartments. The three major quantities of energy that are summed for each are building operational energy use, solar energy availability, and personal transportation energy use. Solar energy availability is determined on the basis of an effective annual collector efficiency. The results show that under the base case, in which solar panels are applied to conventional homes, the high-density development uses one-third less energy than the low-density one. Improving the efficiency of the homes results in a similar trend. Only when the personal vehicle fleet or the solar collectors are made extremely efficient does the trend reverse, with the low-density development yielding lower net energy.
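In outline, the accounting described above amounts to a simple balance (the notation below is ours, not the paper's): net energy per household is operating energy plus personal transportation energy minus the solar energy the dwelling can collect, with the solar term governed by an effective annual collector efficiency.

\[
E_{\mathrm{net}} = E_{\mathrm{operation}} + E_{\mathrm{transport}} - E_{\mathrm{solar}},
\qquad
E_{\mathrm{solar}} = \eta_{\mathrm{eff}} \, A_{\mathrm{collector}} \, G_{\mathrm{annual}},
\]

where \(\eta_{\mathrm{eff}}\) is the effective annual collector efficiency, \(A_{\mathrm{collector}}\) the collector area available to the household, and \(G_{\mathrm{annual}}\) the annual solar irradiation per unit area. As density rises, \(A_{\mathrm{collector}}\) per person shrinks while \(E_{\mathrm{transport}}\) falls, which is the trade-off the study quantifies.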
The design and implementation of security threat mitigation mechanisms in RFID systems, especially in low-cost RFID tags, are gaining great attention in both industry and academia. One main focus of research interest is authentication and privacy techniques to prevent attacks targeting the insecure wireless channel of these systems. Cryptography is a key tool to address these threats. Nevertheless, strong hardware constraints, such as production cost, power consumption, response time, and regulatory compliance, make the use of traditional cryptography in these systems a very challenging problem. Low-overhead procedures therefore become the main approach to solving these problems where traditional cryptography cannot fit. Recent results and trends are surveyed, with an emphasis on lightweight techniques for addressing critical threats against low-cost RFID systems.
A set of sensors establishes barrier coverage of a given line segment if every point of the segment is within the sensing range of some sensor. Given a line segment I, n mobile sensors in arbitrary initial positions on the line (not necessarily inside I), and the sensing ranges of the sensors, we are interested in finding final positions of the sensors that establish barrier coverage of I so that the sum of the distances traveled by all sensors from their initial to final positions is minimized. We show that when the sensors may have different sensing ranges, the problem is NP-hard even to approximate within a constant factor. When the sensors have identical sensing ranges, we give several efficient algorithms to calculate final destinations so that the sensors either establish barrier coverage or, if complete coverage is not feasible, maximize the coverage of the segment, while the sum of the distances traveled by all sensors is minimized. Some open problems are also mentioned.
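For the identical-range case, the flavor of the computation can be illustrated with a small sketch that fixes the candidate final positions to a regular grid of slot centers covering I and moves the i-th leftmost sensor to the i-th slot; this order-preserving matching minimizes the total movement for that fixed target set, whereas the algorithms in the paper also optimize the target positions themselves. The helper name and the grid choice are ours, not the paper's.

def grid_cover_cost(sensor_positions, a, b, r):
    """Illustrative only: cover I = [a, b] with n sensors of identical range r
    by assigning sorted sensors to fixed grid slots; returns (slots, total_move)."""
    xs = sorted(sensor_positions)
    n = len(xs)
    if 2 * r * n < b - a:
        return None, float("inf")   # complete coverage of [a, b] is infeasible
    # Slot centers a + r, a + 3r, ... cover [a, a + 2rn], hence all of [a, b].
    slots = [a + (2 * i + 1) * r for i in range(n)]
    # Matching sorted sensors to sorted slots minimizes the total distance
    # traveled for this particular set of target positions.
    total_move = sum(abs(x - s) for x, s in zip(xs, slots))
    return slots, total_move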
Autonomous agents require trust and reputation concepts in order to identify communities of agents with which to interact reliably, in ways analogous to humans. Agent societies are invariably heterogeneous, with multiple decision-making policies and actions governing their behaviour. Through the introduction of naive agents, this paper shows empirically that while learning agents can identify malicious agents through direct interaction, naive agents compromise utility through their inability to discern malicious agents. Moreover, the impact of the proportion of naive agents on the society is analyzed. The paper demonstrates that witness interaction trust is needed to detect naive agents, in addition to direct interaction trust to detect malicious agents. By proposing a set of policies, the paper demonstrates how learning agents can isolate themselves from naive and malicious agents.
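As an illustration of the distinction drawn above, the following sketch combines a direct interaction trust estimate with witness reports into a single score; the weighting, the beta-style update, and the class interface are assumptions made for illustration and are not the trust model or policies proposed in the paper.

class TrustModel:
    def __init__(self):
        self.direct = {}    # agent -> (successes, failures) from own interactions
        self.witness = {}   # agent -> ratings in [0, 1] reported by other agents

    def record_interaction(self, agent, success):
        s, f = self.direct.get(agent, (0, 0))
        self.direct[agent] = (s + int(success), f + int(not success))

    def record_witness_report(self, agent, rating):
        self.witness.setdefault(agent, []).append(rating)

    def trust(self, agent, w_direct=0.7):
        s, f = self.direct.get(agent, (0, 0))
        direct = (s + 1) / (s + f + 2)                  # beta-style expected value
        reports = self.witness.get(agent, [])
        witness = sum(reports) / len(reports) if reports else 0.5
        # Direct experience detects malicious agents; witness reports help
        # expose naive agents whose own ratings fail to discriminate.
        return w_direct * direct + (1 - w_direct) * witness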
We show that the tilted-grating-assisted excitation of surface plasmon polaritons on gold-coated single-mode optical fibers depends strongly on the state of polarization of the core-guided light, even in fibers with cylindrical symmetry. Rotating the linear polarization of the guided light by 90° relative to the grating tilt plane is sufficient to turn the plasmon resonances on and off with more than 17 dB of extinction ratio. By monitoring the amplitude changes of selected individual cladding mode resonances, we identify what we believe to be a new refractive index measurement method that is shown to be accurate to better than 5 × 10⁻⁵.
Model-based testing (MBT) is about testing a software system by using a model of its behaviour. To benefit fully from MBT, automation support is required. This paper presents a systematic review of prominent MBT tool support, focusing on tools that rely on state-based models. The systematic review protocol precisely describes the scope of the search and the steps involved in tool selection. Precisely defined criteria are used to compare the selected tools; they comprise support for test coverage criteria, the level of automation of various testing activities, and support for the construction of test scaffolding. The results of this review should be of interest to a wide range of stakeholders: software companies interested in selecting the most appropriate MBT tool for their needs, organizations willing to invest in creating MBT tool support, and researchers interested in setting research directions.
This study uses an exploratory qualitative design to examine the lived experience of one group of service users on community treatment orders (CTOs). The study was designed and completed by four graduate students at Carleton University School of Social Work.
Despite the unique features of CTO legislation in Ontario, many findings from this study are remarkably similar to findings of research conducted in other jurisdictions. What is unique in our findings is the lack of focus on the actual conditions and provisions of the CTO. The issue for our participants was less about the CTO itself and more about the labels, control, and discrimination associated with severe mental illness.
This study uses an exploratory qualitative design to examine the lived experiences of a group of service users subject to community treatment orders (CTOs). The study was designed and completed by four students of the Carleton University School of Social Work.
Despite the many unique aspects of the legislation governing CTOs in Ontario, several findings from this study are remarkably similar to those reported in other jurisdictions. What is unique in this research is the lack of focus on the actual conditions and provisions of the CTO. The issue raised by participants was less about the CTO itself than about the labelling, control, and discrimination associated with severe mental illness.
This thesis studies transform-domain model-based algorithms to improve the feedback and noise control performance of hearing aids processing wideband speech in non-stationary environments. Subband adaptive filter structures are investigated for continuous and non-continuous adaptation feedback compensation, and the particle filter framework is used to develop state-space model-based speech enhancement algorithms.
Subband feedback compensation is shown to offer several advantages. The flexibility offered by the frequency division allows subband systems to offer faster and more stable convergence for wideband signals and better tracking of changing acoustic feedback paths. Robustness is also improved, as divergence caused by path changes or input signal correlation is confined to individual frequency bands.
A Rao-Blackwellized particle filter (RBPF) algorithm is proposed for enhancement of speech discrete cosine transform (DCT) coefficients and is evaluated in comparison to the standard fullband RBPF. The DCT subband decomposition is shown to enable improved modeling of wideband speech signals, especially in spectral troughs, thereby decreasing intra-speech noise in both best-case and real-world conditions.
A novel particle filter algorithm is also proposed for short-time spectral amplitude (STSA) speech enhancement. A dynamic model of spectral amplitude evolution is used to allow for speech signal correlation in the frequency domain. Two variants of the basic algorithm are presented: the first incorporates phase information to improve the spectral amplitude estimates; the second uses interacting multiple models to account for speech presence uncertainty. The monaural algorithm is extended to the binaural case with a filter-based speech and noise parameter estimator. The estimator exploits knowledge of the diffuse noise field coherence to separate the clean speech and noise power spectra without external noise or voice-activity estimation.
In both feedback and noise control, the transform domain allows for processing strategies that take into account the diverse frequency-dependent characteristics of the wideband signals. Incorporating models of speech, noise, and the acoustic environment allows the addition of time-domain constraints and a priori knowledge to address signal non-stationarity. The developed algorithms are evaluated using real speech and noise signals recorded with a commercial hearing aid.
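To illustrate the subband feedback-compensation structure in the simplest terms, here is a per-subband NLMS sketch in numpy; the NLMS update, filter length, and step size are generic assumptions for illustration and are not the thesis's specific algorithms or its analysis filterbank.

import numpy as np

class SubbandNLMS:
    """One short adaptive filter per subband, so convergence in one band is not
    slowed by signal power or feedback-path detail in another band."""
    def __init__(self, num_bands, taps=16, mu=0.5, eps=1e-8):
        self.w = np.zeros((num_bands, taps), dtype=complex)  # per-band filter weights
        self.x = np.zeros((num_bands, taps), dtype=complex)  # per-band delay lines
        self.mu, self.eps = mu, eps

    def step(self, x_bands, d_bands):
        """x_bands, d_bands: one complex sample per subband (e.g. from a DFT
        filterbank) of the receiver output and of the microphone signal."""
        self.x = np.roll(self.x, 1, axis=1)
        self.x[:, 0] = x_bands
        y = np.sum(np.conj(self.w) * self.x, axis=1)          # estimated feedback
        e = d_bands - y                                       # compensated signal
        norm = np.sum(np.abs(self.x) ** 2, axis=1) + self.eps
        self.w += (self.mu * np.conj(e) / norm)[:, None] * self.x  # NLMS update
        return e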