The purpose of this article is to improve understanding of internationalization as a strategic response to the catalysts of globalization and the knowledge society. The paper critically identifies and interprets how these catalysts are being recontextualized and translated into responsive internationalization policies and systemic institutional change. The article takes a critical approach to current internationalization efforts and provides a conceptual framework for developing a performance indicator set by combining institutional change theory (North 1990) with the Delta cycle for internationalization (Rumbley 2010). Recommendations for future research areas are made at the conclusion of the article.
Few people have bothered to defend the majoritarian, winner-take-all character of the current Canadian electoral system, a parliamentary system that has existed in the same form since the founding of the modern state in 1867. In these remarks, I offer a defense of majoritarianism in the Canadian context when the alternative is some form of proportional representation. The remarks were prepared as an opening statement in a debate on electoral reform at a Faculty of Public Affairs 75th Anniversary conference at Carleton University, March 3, 2017.
The debate arose because of the Prime Minister's intention, announced during the election campaign that led to his victory in 2015, to replace the current system with some alternative. The debate occurred a few months after the release of a lengthy report on electoral reform by a special all-party committee of the House of Commons. A few weeks before the debate, the Prime Minister announced (independently of the debate, of course) that his government would no longer pursue electoral reform, perhaps because it looked as though he would not be able to avoid a referendum, a process that is hard to control. In any event, and especially in light of recent attempts to change the system both at the federal level and in some provinces, I think it is important for people to understand that the existing electoral system is a sensible one that will likely continue to serve us well.
Since 2014, Carleton University Library has been adding to the ways it practices collection development. In addition to the subject liaison firm order model, we have added three successful user-centred ways to acquire material: we ended our approval plan and used its selection framework to create a demand-driven acquisition (DDA) plan, we started a textbook purchasing program in Reserves, and we instituted print purchase-on-demand procedures in interlibrary loan (ILL). This poster provides an overview and key takeaways for each initiative.
Libraries are quickly becoming spaces for more than just books and journals. At Carleton University's MacOdrum Library, we used Minecraft to introduce elementary and high school students to the power of gaming as a tool to foster education, research, and collaboration. In May 2015, we invited students to take part in an initiative that engaged them with a local undertaking, the LeBreton Flats Redevelopment Project. The redevelopment project, led by the National Capital Commission (NCC), shortlisted four developers and published their proposals for the community to see. Using the criteria presented by the four pre-qualified proponents, the students were asked to research and propose their own ideas for the space, building their plans within a 1:1 scale replica of LeBreton Flats in Minecraft.
Usable security has unique usability challenges because the need for security often means that standard human-computer interaction approaches cannot be directly applied. An important usability goal for authentication systems is to support users in selecting better passwords, thus increasing security by expanding the effective password space. In click-based graphical passwords, poorly chosen passwords lead to the emergence of hotspots: portions of the image where users are more likely to select click-points, allowing attackers to mount more successful dictionary attacks. We use persuasion to influence user choice in click-based graphical passwords, encouraging users to select more random, and hence more secure, click-points. Our approach is to introduce persuasion to the Cued Click-Points graphical password scheme (Chiasson, van Oorschot, and Biddle, 2007). Our resulting scheme significantly reduces hotspots while still maintaining usability.
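To make the persuasion mechanism concrete, here is a minimal Python sketch of a viewport-style approach, offered as an illustration rather than the authors' implementation: password creation shades the image except for a small, randomly placed viewport and accepts only clicks inside it. Image and viewport dimensions are placeholders.

```python
import random

def random_viewport(img_w, img_h, vp_size=75):
    """Pick a random viewport (x, y, size) fully inside the image.

    During password creation, the image outside the viewport would be
    shaded, steering the user's click-point toward a random region.
    """
    x = random.randint(0, img_w - vp_size)
    y = random.randint(0, img_h - vp_size)
    return x, y, vp_size

def click_accepted(click, viewport):
    """Accept a click only if it falls inside the current viewport."""
    cx, cy = click
    x, y, size = viewport
    return x <= cx < x + size and y <= cy < y + size

# A "shuffle" action would simply call random_viewport again, letting
# users relocate the viewport without weakening the randomization.
vp = random_viewport(451, 331)  # image dimensions are assumptions
print(vp, click_accepted((vp[0] + 10, vp[1] + 10), vp))
```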
Energy modeling and optimization studies can facilitate the design of cost-effective, low-energy buildings. However, this process inevitably involves uncertainties such as predicting occupant behavior, future climate, and econometric parameters. As presently practiced, energy modelers typically do not quantify the implications of these unknowns for performance outcomes. This paper describes an energy modeling approach to quantify economic risk and better inform decision makers of the economic feasibility of a project. The proposed methodology suggests how economic uncertainty can be quantified within an optimization framework. This approach improves modeling outcomes by factoring in the effect of variability in assumptions and improves confidence in simulation results. The methodology is demonstrated using a case study of a net zero energy commercial office building located in London, ON, Canada.
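As a hedged illustration of the idea (not the paper's model), the sketch below Monte Carlo samples three econometric unknowns and reports the resulting spread in life-cycle cost for one hypothetical design; all input values are assumptions.

```python
import random
import statistics

def life_cycle_cost(capital, annual_energy_kwh, price, escalation,
                    discount, years=25):
    """Net present cost of a design under given econometric assumptions."""
    cost = capital
    for t in range(1, years + 1):
        cost += (annual_energy_kwh * price * (1 + escalation) ** t
                 / (1 + discount) ** t)
    return cost

# Illustrative inputs (assumptions, not from the case study).
samples = []
for _ in range(5000):
    price = random.gauss(0.13, 0.02)       # $/kWh
    escalation = random.gauss(0.02, 0.01)  # energy price escalation
    discount = random.gauss(0.04, 0.01)    # discount rate
    samples.append(life_cycle_cost(250_000, 40_000, price,
                                   escalation, discount))

print(f"mean LCC: ${statistics.mean(samples):,.0f}")
print(f"95th percentile: ${sorted(samples)[int(0.95 * len(samples))]:,.0f}")
```

Reporting a percentile alongside the mean is what turns a single deterministic estimate into a statement of economic risk a decision maker can act on.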
The use of open linked data in libraries is quickly developing as a means of connecting digital content from the web to local library collections. In the world of cataloguing, metadata, and authority control, using controlled vocabularies through open linked data presents the possibility of providing library patrons with access to a seemingly unlimited expanse of digital resources. Encouraged by this potential, the Carleton University Library is currently implementing open linked data models within its institutional repository in order to connect users to digital content within our repository, our ILS, and beyond. This poster presents the ideas and processes behind this innovative project and aims to inspire other libraries to implement open linked data concepts in order to enhance the discoverability of their own digital collections.
Learning Outcomes:
• A clear explanation of open linked data concepts, with diagrams to illustrate key points
• How libraries of all sizes can use linked data for authority control to expand access to digital collections (a minimal sketch follows this list)
• How libraries can use linked data to promote and expand access to OA publications
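The sketch below, using the rdflib Python library, shows the kind of authority-control linking described above: a local name heading is connected to external authority URIs. The LCNAF and VIAF identifiers shown are placeholders, and the local namespace is hypothetical.

```python
# pip install rdflib
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, SKOS

g = Graph()
local = Namespace("https://repository.example.edu/authority/")  # hypothetical

# Link a local author heading to external authority URIs so that
# catalogue records and repository items resolve to the same entity.
author = local["doe-jane"]
g.add((author, RDF.type, SKOS.Concept))
g.add((author, SKOS.prefLabel, Literal("Doe, Jane")))
g.add((author, SKOS.exactMatch,
       URIRef("http://id.loc.gov/authorities/names/n00000000")))  # placeholder
g.add((author, SKOS.exactMatch,
       URIRef("http://viaf.org/viaf/00000000")))                  # placeholder

print(g.serialize(format="turtle"))
```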
Energy models are commonly used to examine the multitude of pathways to improve building performance. As presently practiced, a deterministic approach is used to evaluate incremental design improvements to achieve performance targets. However, significant insight can be gained by examining the implications of modeling assumptions using a probabilistic approach. Analyzing the effect of small perturbations on the inputs of energy and economic models can improve decision making and modeler confidence in building simulation results. This paper describes a reproducible methodology which aids modelers in identifying energy and economic uncertainties caused by variability in solar exposure. Using an optimization framework, uncertainty is quantified across the entire simulation solution space. This approach improves modeling outcomes by factoring in the effect of variability in assumptions and improves confidence in simulation results. The methodology is demonstrated using a net zero energy commercial office building case study.
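A minimal sketch of the perturbation workflow, with a toy stand-in for the building model: the solar-exposure input is perturbed repeatedly and the spread of the net-energy outcome is reported. All values are assumptions, not the case study's.

```python
import random
import statistics

def annual_energy_balance(pv_kw, solar_factor):
    """Toy energy model: PV yield scales with a solar-exposure factor.

    A real model would come from a building simulation engine; this
    stand-in only illustrates the perturbation workflow.
    """
    generation = pv_kw * 1150 * solar_factor  # kWh/kW-yr, assumed yield
    consumption = 95_000                      # kWh/yr, assumed load
    return generation - consumption

# Perturb solar exposure (shading, snow, weather-year variability)
# and report the spread of the net-energy outcome.
results = [annual_energy_balance(85, random.gauss(1.0, 0.06))
           for _ in range(2000)]
print(f"mean net energy: {statistics.mean(results):,.0f} kWh")
print(f"std dev:         {statistics.stdev(results):,.0f} kWh")
```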
Net-zero energy is an influential idea in guiding the building stock towards renewable energy resources. Increasingly, this target is scaled to entire communities, which may include dozens of buildings in each new development phase. Although building energy modelling processes and codes have been well developed to guide decision making, there is a lack of methodologies for community integrated energy masterplanning. The problem is further complicated by the availability of district systems which better harvest and store on-site renewable energy. In response to these challenges, this paper contributes an energy modelling methodology which helps energy masterplanners determine trade-offs between building energy saving measures and district system design. Furthermore, this paper shows that it is possible to mitigate the electrical and thermal peaks of a net-zero energy community using minimal district equipment. The methodology is demonstrated using a cold-climate case study with both significant heating/cooling loads and solar energy resources.
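As a toy illustration of the trade-off between building measures and district equipment (not the paper's methodology), the sketch below shows how a modestly sized district battery can flatten a community's net electrical profile; profiles and capacities are assumed.

```python
def smooth_with_storage(net_load, capacity_kwh, power_kw):
    """Charge on surplus, discharge on deficit, within equipment limits."""
    soc, smoothed = 0.0, []
    for load in net_load:
        if load > 0:  # deficit: discharge
            d = min(load, power_kw, soc)
            soc -= d
            smoothed.append(load - d)
        else:         # surplus: charge
            c = min(-load, power_kw, capacity_kwh - soc)
            soc += c
            smoothed.append(load + c)
    return smoothed

# Hourly community net load in kW (negative = PV surplus); assumed values.
hourly_net = [-120, -200, -260, -180, 40, 150, 220, 190]
flattened = smooth_with_storage(hourly_net, capacity_kwh=600, power_kw=150)
print("peak before:", max(abs(x) for x in hourly_net), "kW")
print("peak after: ", max(abs(x) for x in flattened), "kW")
```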
This paper presents a multi-objective redesign case study of an archetype solar house based on a near net zero energy (NZE) demonstration home located in Eastman, Quebec. Using optimization techniques, pathways are identified from the original design to both cost- and energy-optimal designs. An evolutionary algorithm is used to optimize trade-offs between passive solar gains and active solar generation, using two objective functions: net energy consumption and life-cycle cost over a thirty-year life cycle. In addition, this paper explores different pathways to net zero energy based on economic incentives, such as feed-in tariffs for on-site electricity production from renewables. The main objective is to identify pathways to net zero energy that will facilitate the future systematic design of similar homes based on the archetype concept, which combines passive solar design; energy-efficiency measures, including a geothermal heat pump; and a building-integrated photovoltaic system. Results from this paper can be used as follows: (1) to apply systematic design improvements and lessons learned from a proven NZE home design concept, (2) to follow a methodology for understanding pathways to cost- and energy-optimal building designs, and (3) to aid in developing policy on economic incentives that can positively influence optimized home design.
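For readers unfamiliar with multi-objective optimization, the sketch below extracts a Pareto front over randomly generated candidate designs scored on the two objectives named above; it stands in for, and is much simpler than, the paper's evolutionary algorithm.

```python
import random

def dominates(a, b):
    """True if design a is at least as good on both objectives and
    strictly better on one (both objectives are minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# Hypothetical (net_energy_kwh, life_cycle_cost_$) pairs.
designs = [(random.uniform(-5000, 20000), random.uniform(300e3, 500e3))
           for _ in range(200)]

# The Pareto front: designs no other design dominates. Each point is a
# distinct trade-off between net energy and life-cycle cost.
pareto = [d for d in designs
          if not any(dominates(other, d) for other in designs if other != d)]
print(f"{len(pareto)} non-dominated designs out of {len(designs)}")
```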
Net zero energy (NZE) communities are becoming pivotal to the energy vision of developers. Communities that produce as much energy as they consume provide many benefits, such as reduced life-cycle costs and better resilience to grid outages. If deployed using smart-grid technology, NZE communities can act as a grid node and aid in balancing electrical demand. However, identifying cost-effective pathways to NZE requires detailed energy and economic models. The information required to build such models is typically not available at the early master-planning stages, where the largest energy and economic saving opportunities exist. Methodologies that expedite and streamline energy and economic modeling could facilitate early decision making. This paper describes a reproducible methodology that aids modelers in identifying energy and economic savings opportunities in the early community design stages. As additional information becomes available, models can quickly be recreated and evaluated. The proposed methodology is applied to the first-phase design of a NZE community under development in Southwestern Ontario.
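One hedged reading of how early-stage community models can be assembled from sparse information: per-archetype intensities and unit counts (all figures below are assumptions) are aggregated into a community energy balance that can be re-run quickly as the master plan evolves.

```python
ARCHETYPES = {
    # name: (floor_area_m2, energy_use_kwh_per_m2, roof_pv_kw) -- assumed
    "townhouse": (160, 85, 6),
    "mid-rise":  (4500, 110, 90),
    "office":    (8000, 120, 150),
}

def community_balance(plan, pv_yield=1150):
    """Return (consumption, generation) in kWh/yr for a unit-count plan."""
    use = sum(n * ARCHETYPES[a][0] * ARCHETYPES[a][1] for a, n in plan.items())
    gen = sum(n * ARCHETYPES[a][2] * pv_yield for a, n in plan.items())
    return use, gen

# Re-running with an updated plan is a one-line change, which is the point:
# the model keeps pace with early master-planning iterations.
use, gen = community_balance({"townhouse": 120, "mid-rise": 4, "office": 1})
print(f"consumption: {use:,.0f} kWh/yr, generation: {gen:,.0f} kWh/yr, "
      f"net: {use - gen:,.0f} kWh/yr")
```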
Resource Description and Access (RDA) is the new content standard coming in Spring 2013, with national libraries adopting RDA effective March 30, 2013. Libraries need to address training for staff in all departments on how to interpret, catalogue, and use RDA records.
There have been a number of steganography embedding techniques proposed over the past few years. In turn, there has been great interest in steganalysis techniques as the embedding techniques improve. Specifically, universal steganalysis techniques have become more attractive since they work independently of the embedding technique. In this work, we examine the effectiveness of a basic universal technique that relies on some knowledge about the cover media but not the embedding technique. We consider images as the cover medium, and examine how a single technique that we call steganographic sanitization performs on 26 different steganography programs that are publicly available on the Internet. Our experiments are completed using a number of secret messages and a variety of different levels of sanitization. In addition, since our intent is to remove covert communication, not authentication information, we examine how well the sanitization process preserves authentication information such as watermarks and digital fingerprints.
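As an illustration of one sanitization primitive (an assumption about the approach, since the paper spans 26 tools and several sanitization levels), the sketch below randomizes the least significant bit-plane of an image. This destroys LSB-embedded payloads while changing each pixel by at most one intensity level, which is why robust watermarks can survive.

```python
import numpy as np

def sanitize_lsb(pixels: np.ndarray, planes: int = 1) -> np.ndarray:
    """Randomize the lowest `planes` bit-planes of an 8-bit image array."""
    mask = (1 << planes) - 1
    noise = np.random.randint(0, mask + 1, size=pixels.shape, dtype=np.uint8)
    return (pixels & ~np.uint8(mask)) | noise

# Stand-in image; a real pipeline would load cover files from disk.
img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
clean = sanitize_lsb(img, planes=1)
print("max per-pixel change:",
      int(np.abs(img.astype(int) - clean.astype(int)).max()))
```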
In this work we discuss our efforts to use the ubiquity and mobility of smart phone systems to stream historical information about the user's current place on the earth. We propose the concept of timescapes to portray the historical significance of where a user is standing and to allow a brief travel through time. By combining GPS location with a rich media interpretation of existing historical documents, historical facts become an on-demand resource available to travellers, school children, historians, and any interested third party. To our knowledge, this is the first introduction of the term timescape in the context of historical information pull.
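A minimal sketch of the location-based lookup a timescape implies: given a GPS fix, return historical records within a radius. The records, coordinates, and radius below are illustrative placeholders.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

RECORDS = [  # (lat, lon, year, description) -- placeholder entries
    (45.4154, -75.7140, 1900, "Great Fire sweeps LeBreton Flats"),
    (45.4215, -75.6972, 1866, "Parliament Hill construction completed"),
]

def nearby_history(lat, lon, radius_km=1.0):
    """Return records within radius_km of the user's GPS fix."""
    return [r for r in RECORDS if haversine_km(lat, lon, r[0], r[1]) <= radius_km]

for rec in nearby_history(45.4148, -75.7130):
    print(rec[2], rec[3])
```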
New threats to networks are constantly arising. This justifies protecting network assets and mitigating the risk associated with attacks. In a distributed environment, researchers aim, in particular, at eliminating faulty network entities. More specifically, much research has been conducted on locating a single static black hole, which is defined as a network site whose existence is known a priori and that disposes of any incoming data without leaving any trace of this occurrence. However, the prevalence of faulty nodes requires an algorithm able to (a) identify faulty nodes that can be repaired without human intervention and (b) locate black holes, which are taken to be faulty nodes whose repair does require human intervention. In this paper, we consider a specific attack model that involves multiple faulty nodes that can be repaired by mobile software agents, as well as a virus v that can infect a previously repaired faulty node and turn it into a black hole. We refer to the task of repairing multiple faulty nodes and pointing out the location of the black hole as the Faulty Node Repair and Dynamically Spawned Black Hole Search. We first analyze the attack model we put forth. We then explain (a) how to identify whether a node is (1) a normal node, (2) a repairable faulty node, or (3) the black hole that has been infected by virus v during the search/repair process, and (b) how to perform the correct relevant actions. These two steps constitute a complex task which, we explain, differs significantly from the traditional Black Hole Search. We continue by proposing an algorithm to solve this problem in an asynchronous ring network with only one whiteboard (which resides in a node called the homebase). We prove the correctness of our solution and analyze its complexity through both theoretical analysis and experimental evaluation. We conclude that, using our proposed algorithm, b + 4 agents can repair all faulty nodes and locate the black hole infected by virus v within finite time. Our algorithm works even when the number of faulty nodes b is unknown a priori.
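The sketch below is a deliberately simplified, sequential toy of the cautious-probing idea that underlies black hole search generally; it is not the paper's b + 4-agent asynchronous ring algorithm. An agent logs its intent on the homebase whiteboard before entering an unexplored node, so an agent that never returns pinpoints the black hole, while merely faulty nodes are repaired in passing.

```python
# Hidden ground truth for a toy ring; the searching agents cannot read it.
RING = ["ok", "faulty", "ok", "black_hole", "ok"]
whiteboard = {"intent": None, "explored": set()}  # lives at the homebase

def probe(node):
    """Log intent, visit the node, and repair it if it is merely faulty.
    Returns False if the agent is destroyed (entered the black hole)."""
    whiteboard["intent"] = node
    if RING[node] == "black_hole":
        return False              # agent never reports back
    if RING[node] == "faulty":
        RING[node] = "ok"         # repairable without human intervention
    whiteboard["explored"].add(node)
    whiteboard["intent"] = None
    return True

for n in range(len(RING)):
    if not probe(n):
        # The stale intent entry on the whiteboard betrays the black hole.
        print("black hole located at node", whiteboard["intent"])
        break
```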
Semantic scene classification is a challenging problem in computer vision. In this paper, we present a novel multi-level active learning approach to reduce the human annotation effort for training robust scene classification models. Unlike most existing active learning methods, which can only query labels for selected instances at the target categorization level, i.e., the scene class level, our approach establishes a semantic framework that predicts scene labels based on a latent object-based semantic representation of images and can query labels at two different levels: the target scene class level (abstractive high level) and the latent object class level (semantic middle level). Specifically, we develop an adaptive active learning strategy to perform multi-level label query, which maintains the default label query at the target scene class level but switches to the latent object class level whenever an "unexpected" target class label is returned by the labeler. We conduct experiments on two standard scene classification datasets to investigate the efficacy of the proposed approach. Our empirical results show that the proposed adaptive multi-level active learning approach can outperform both baseline active learning methods and a state-of-the-art multi-level active learning method.
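The adaptive switching rule can be made concrete with a short sketch, using stand-ins for the classifier, the labeler, and the data: queries stay at the scene level until the returned label disagrees with the model's prediction, at which point the query drops to the latent object level.

```python
import random

def predict_scene(x):        # stand-in classifier
    return random.choice(["kitchen", "street"])

def oracle_scene_label(x):   # human annotator (simulated)
    return x["scene"]

def oracle_object_labels(x):  # mid-level annotation (simulated)
    return x["objects"]

pool = [{"scene": "kitchen", "objects": ["stove", "sink"]},
        {"scene": "street",  "objects": ["car", "sign"]}]

for x in pool:
    y = oracle_scene_label(x)
    if y != predict_scene(x):
        # Unexpected scene label: switch to the latent object level,
        # where the extra supervision explains *why* the model erred.
        objs = oracle_object_labels(x)
        print(f"unexpected '{y}': queried objects {objs}")
    else:
        print(f"expected '{y}': scene-level label only")
```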
The field of game playing is a particularly well-studied area within the context of AI, and has led to the development of powerful techniques, such as alpha-beta search, capable of achieving competitive game play against an intelligent opponent. It is well known that tree pruning strategies, such as alpha-beta, benefit strongly from proper move ordering, that is, searching the best element first. Inspired by the formerly unrelated field of Adaptive Data Structures (ADSs), we previously introduced the History-ADS technique, which employs an adaptive list to achieve effective and dynamic move ordering in a domain-independent fashion, and found that it performs well in a wide range of cases. However, that work did not compare the History-ADS heuristic to any established move ordering strategy. To address this gap, we present here a comparison with two well-known, acclaimed strategies that operate on a philosophy similar to the History-ADS: the History Heuristic and the Killer Moves technique. We find that, in a wide range of two-player and multi-player games, at various points in the game's progression, the History-ADS performs at least as well as these strategies and, in fact, outperforms them in the majority of cases.
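A minimal sketch of the adaptive-list idea behind the History-ADS (the move-to-front update shown is one of several rules such a structure admits): moves that produce cutoffs migrate to the front of a list, and move generation consults that list to order the search.

```python
class AdaptiveMoveList:
    """Adaptive list for move ordering in an alpha-beta style search."""

    def __init__(self):
        self.moves = []

    def record_cutoff(self, move):
        """Move-to-front: a move that just produced a cutoff is promoted."""
        if move in self.moves:
            self.moves.remove(move)
        self.moves.insert(0, move)

    def order(self, legal_moves):
        """Search listed moves first (in list order), the rest after."""
        ranked = [m for m in self.moves if m in legal_moves]
        return ranked + [m for m in legal_moves if m not in ranked]

ads = AdaptiveMoveList()
ads.record_cutoff("Nf3")
ads.record_cutoff("e4")
print(ads.order(["a3", "Nf3", "e4", "d4"]))  # -> ['e4', 'Nf3', 'a3', 'd4']
```

Because the list reorganizes itself purely from observed cutoffs, the same mechanism applies unchanged across different games, which is the domain independence the abstract refers to.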