The article scrutinizes the complex entanglement of cyberurban spaces in the making and development of contemporary social movements by analyzing their imaginaries, practices, and trajectories.
This issue of New Geographies, “Geographies of Information” (edited by Taraneh Meskhani & Ali Fard), presents a new set of frameworks that refrain from generalizations to highlight the many facets of the socio-technical constructions, processes, and practices that form the spaces of information and communication. In addition to Lim, contributors to the issue include prominent thinkers and scholars in various related disciplines, such as Rob Kitchin (critical data), Stephen Graham (urbanism) and Malcolm McCullough (architecture/urban computing).
An earlier version of this paper was prepared for a Symposium, Cultural Policies in Regional Integration, sponsored by the Center for the Study of Western Hemispheric Trade and the Mexican Center of the Institute of Latin American Studies, The University of Texas at Austin, February 2, 1998.
Energy models are commonly used to examine the multitude of pathways to improve building performance. As presently practiced, a deterministic approach is used to evaluate incremental design improvements to achieve performance targets. However, significant insight can be gained by examining the implications of modeling assumptions using a probabilistic approach. Analyzing the effect of small perturbations on the inputs of energy and economic models can improve decision making and modeler confidence in building simulation results. This paper describes a reproducible methodology that aids modelers in identifying energy and economic uncertainties caused by variabilities in solar exposure. Using an optimization framework, uncertainty is quantified across the entire simulation solution space. This approach improves modeling outcomes by factoring in the effect of variability in assumptions and improves confidence in simulation results. The methodology is demonstrated using a net zero energy commercial office building case study.
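For illustration only, the following sketch shows the flavour of such a perturbation analysis: a solar-exposure assumption is varied stochastically and the spread of the resulting annual energy estimate is reported. The stand-in energy model, parameter values, and class names are hypothetical; an actual study would call a full building simulation engine at each sample.

import java.util.Random;

/**
 * Minimal sketch of a probabilistic sensitivity run: perturb a solar-exposure
 * assumption and observe the spread in a (stand-in) annual energy estimate.
 * The energy model here is a placeholder, not the paper's actual workflow.
 */
public class SolarUncertaintySketch {

    // Hypothetical stand-in for a building energy model: annual net energy (kWh)
    // as a function of a solar exposure multiplier (1.0 = nominal assumption).
    static double annualNetEnergyKWh(double solarExposureFactor) {
        double baselineLoad = 120_000.0;   // assumed annual load, kWh
        double nominalPvYield = 95_000.0;  // assumed PV yield at nominal exposure, kWh
        return baselineLoad - nominalPvYield * solarExposureFactor;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        int samples = 10_000;
        double sum = 0.0, sumSq = 0.0;

        for (int i = 0; i < samples; i++) {
            // Perturb the solar exposure assumption by a few percent (Gaussian).
            double factor = 1.0 + 0.05 * rng.nextGaussian();
            double e = annualNetEnergyKWh(factor);
            sum += e;
            sumSq += e * e;
        }

        double mean = sum / samples;
        double std = Math.sqrt(sumSq / samples - mean * mean);
        System.out.printf("Net energy: mean = %.0f kWh, std = %.0f kWh%n", mean, std);
    }
}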
Net-zero energy is an influential idea in guiding the building stock towards renewable energy resources. Increasingly, this target is scaled to entire communities, which may include dozens of buildings in each new development phase. Although building energy modelling processes and codes have been well developed to guide decision making, there is a lack of methodologies for community-integrated energy masterplanning. The problem is further complicated by the availability of district systems which better harvest and store on-site renewable energy. In response to these challenges, this paper contributes an energy modelling methodology which helps energy masterplanners determine trade-offs between building energy saving measures and district system design. Furthermore, this paper shows that it is possible to mitigate electrical and thermal peaks of a net-zero energy community using minimal district equipment. The methodology is demonstrated using a cold-climate case study with both significant heating/cooling loads and solar energy resources.
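As a rough illustration of the peak-mitigation idea (not the paper's methodology), the sketch below clips a hypothetical community net-load profile with a single district battery; all profile values and equipment sizes are invented.

/**
 * Illustrative sketch: shaving the electrical peak of a small community
 * net-load profile with one district battery. Values are made up.
 */
public class DistrictPeakShavingSketch {

    public static void main(String[] args) {
        // Hypothetical hourly community net load (kW) for one day: demand minus on-site PV.
        double[] netLoadKw = { 300, 280, 270, 260, 270, 320, 420, 520, 480, 350,
                               150, -50, -120, -100, 20, 180, 400, 620, 700, 650,
                               560, 480, 400, 340 };

        double batteryEnergyKwh = 800.0;   // assumed district storage capacity
        double batteryPowerKw  = 250.0;    // assumed charge/discharge limit
        double threshold       = 450.0;    // shave anything above this level
        double soc = batteryEnergyKwh / 2; // start half full

        double originalPeak = 0, shavedPeak = 0;
        for (double load : netLoadKw) {
            originalPeak = Math.max(originalPeak, load);
            double adjusted = load;
            if (load > threshold) {                       // discharge to clip the peak
                double discharge = Math.min(Math.min(load - threshold, batteryPowerKw), soc);
                soc -= discharge;
                adjusted = load - discharge;
            } else if (load < 0) {                        // charge on PV surplus
                double charge = Math.min(Math.min(-load, batteryPowerKw), batteryEnergyKwh - soc);
                soc += charge;
                adjusted = load + charge;
            }
            shavedPeak = Math.max(shavedPeak, adjusted);
        }
        System.out.printf("Peak reduced from %.0f kW to %.0f kW%n", originalPeak, shavedPeak);
    }
}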
This paper presents a multi-objective redesign case study of an archetype solar house based on a near net zero energy (NZE) demonstration home located in Eastman, Quebec. Using optimization techniques, pathways are identified from the original design to both cost and energy optimal designs. An evolutionary algorithm is used to optimize trade-offs between passive solar gains and active solar generation, using two objective functions: net-energy consumption and life-cycle cost over a thirty-year life cycle. In addition, this paper explores different pathways to net zero energy based on economic incentives, such as feed-in tariffs for on-site electricity production from renewables. The main objective is to identify pathways to net zero energy that will facilitate the future systematic design of similar homes based on the concept of the archetype that combines passive solar design; energy-efficiency measures, including a geothermal heat pump; and a building-integrated photovoltaic system. Results from this paper can be utilized as follows: (1) systematic design improvements and applications of lessons learned from a proven NZE home design concept, (2) use of a methodology to understand pathways to cost and energy optimal building designs, and (3) to aid in policy development on economic incentives that can positively influence optimized home design.
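To make the two-objective trade-off concrete, the following sketch filters a handful of hypothetical candidate designs down to the non-dominated (Pareto-optimal) set on net energy and life-cycle cost; the candidate values are invented, and a real study would obtain them from building simulation driven by the evolutionary algorithm.

import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch of the two-objective trade-off: candidate designs are
 * compared on net-energy consumption and life-cycle cost, and only
 * non-dominated (Pareto-optimal) designs are retained.
 */
public class ParetoFrontSketch {

    record Design(String name, double netEnergyKWh, double lifeCycleCost) {}

    // d1 dominates d2 if it is no worse on both objectives and better on at least one.
    static boolean dominates(Design d1, Design d2) {
        boolean noWorse = d1.netEnergyKWh() <= d2.netEnergyKWh()
                       && d1.lifeCycleCost() <= d2.lifeCycleCost();
        boolean better  = d1.netEnergyKWh() < d2.netEnergyKWh()
                       || d1.lifeCycleCost() < d2.lifeCycleCost();
        return noWorse && better;
    }

    public static void main(String[] args) {
        List<Design> candidates = List.of(
            new Design("original",        9_000, 310_000),
            new Design("more PV",         2_000, 335_000),
            new Design("better envelope", 5_500, 300_000),
            new Design("oversized HVAC",  8_000, 360_000));   // dominated

        List<Design> front = new ArrayList<>();
        for (Design d : candidates) {
            boolean dominated = candidates.stream().anyMatch(o -> dominates(o, d));
            if (!dominated) front.add(d);
        }
        front.forEach(d -> System.out.println("Pareto-optimal: " + d));
    }
}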
Net zero energy (NZE) communities are becoming pivotal to the energy vision of developers. Communities that produce as much energy as they consume provide many benefits, such as reducing life-cycle costs and better resilience to grid outages. If deployed using smart-grid technology, NZE communities can act as a grid node and aid in balancing electrical demand. However, identifying cost-effective pathways to NZE requires detailed energy and economic models. Information required to build such models is not typically available at the early master-planning stages, where the largest energy and economic saving opportunities exist. Methodologies that expedite and streamline energy and economic modeling could facilitate early decision making. This paper describes a reproducible methodology that aids modelers in identifying energy and economic savings opportunities in the early community design stages. As additional information becomes available, models can quickly be recreated and evaluated. The proposed methodology is applied to the first-phase design of a NZE community under development in Southwestern Ontario.
Police in schools: In an era when the costs of policing are constantly under scrutiny from governing municipalities, the time has come for police agencies to re-evaluate the services they provide. To do this, they need to answer questions relating to the value that different activities they perform create in the communities they serve. In other words, they need to change the focus of the conversation from “what does this service cost” to “what value does this service provide.”
This document summarizes key findings from a longitudinal (2014-2017), multi-method (quantitative, qualitative, and ethnographic analysis, along with a Social Return on Investment [SROI] analysis) case study undertaken to identify the value of School Resource Officers (SROs) that are employed by Peel Regional Police and work in the service’s Neighborhood Police Unit (NPU). Of note is the application of SROI techniques in this evaluation process. SROI, a methodology that emerged from the not-for-profit sector, helps researchers identify sources of value outside of those considered through traditional valuation techniques, such as cost-benefit analysis.
Evaluation of Peel Police’s SRO program was motivated by a number of factors. First, the costs of this program are both easy to identify and significant (just over $9 million per year). Second, it is very challenging to identify the value that this program provides to students and the community. The challenges of quantifying the value offered by assigning full-time SROs to Canadian high schools are evidenced by the fact that such programs are rare, as police services around the world have responded to pressures to economize by removing officers from schools and either eliminating the role of the SRO or having one officer attend to many schools.
Carleton University Library has an innovative staff development program to expand the skill set of e-book cataloguers to provide a comprehensive service to manage and expand access to e-books. In 2009 Carleton University Library hired its first e-book cataloguer in response to the rapid growth of digital resources in the Library collection; a second position was added in 2011. These positions have successfully evolved to incorporate a wide variety of duties related to e-books in response to a rapidly changing digital environment. Conference poster presented at the CLA annual conference, June 3 to 5, 2015 in Ottawa, Ontario.
Resource Description and Access (RDA) is the new content standard coming in Spring 2013, with national libraries using RDA effective March 30, 2013. Libraries need to address training for staff in all departments on how to interpret, catalogue and use RDA records.
My study attempted to find out whether the old part of our brain (the limbic system) has a significant role in influencing how we detect the valence of blurry words without conscious awareness of what the words are. Ten participants were shown blurry words that could not be read and were asked to guess their valence, without a time limit. The hypotheses for this study were that participants would be accurate in detecting the valence of blurred words and that participants would rate negative words the most accurately. I also predicted that participants would attempt to read words before rating valence and that they would attempt to read the words only in the beginning. The stimuli were shown to the participants on printed paper. There were 10 blurred words per page, each accompanied by a 5-point Likert scale, with a reference scale at the top of every page. My data showed a statistically significant difference between participants’ ability to detect the valence of blurred words and the normal ability for unblurred words (which is 100% accuracy): participants were significantly worse at detecting the valence of blurred words than unblurred words. There was no statistically significant difference between participants’ ability to detect the valence of blurry neutral words and blurry nonsensical words; participants were equally accurate for both of these word types. Participant responses also showed that they were statistically better at detecting the valence of negative blurry words than positive blurry words, indicating that negative valence was detected more accurately than other valences.
The Iowa Gambling Task (IGT) is widely used to assess the role of emotion in decision making. However, there is only indirect evidence to support that the task measures emotion, and there are inconsistencies in performance within healthy populations who display risk-taking traits. Two hundred and fifty participants were assessed for psychopathy, sensation seeking, and impulsiveness. The IGT was compared with modified versions that directly manipulated emotion within the task by indexing reward and punishment cards with images varying in emotional content. Participants continued to learn to avoid risky decks in all versions of the IGT. The manipulation of emotional content within the task did affect performance: fearful images contributed to greater risky deck selections. Across the tasks, psychopathy showed the strongest relationship to risky deck selections, with lower levels of psychopathy associated with decreased risky deck selections. However, psychopathy did not affect learning in the modified versions. Exploratory analysis on image valence found that negative images (compared to neutral) benefited learning for individuals with higher levels of psychopathy. Discussion will center on the benefits of manipulating emotion directly within the task as a means to assess the validity of the IGT.
The debate surrounding how emotion and cognition are organized in the brain often leads to Damasio’s Somatic Marker Hypothesis. This theory endorses a highly interactive process between emotion and cognition, but has been criticized for being too broad to capture the specific links between the two. It also implies that emotion operates from a neural architecture that is dissociable from cognition. Although empirical findings from the Iowa Gambling Task lend support for the theory, this can promote a false dichotomy between emotion and cognition. Issues will be raised regarding the view that the theory and the task are ill-formulated to account for the phases of decision making. Further theoretical work may be required to align the task with Damasio’s view of emotion as integrated with cognition.
First edition published on the occasion of an exhibition of R.L. Griffiths’ paintings at Wallack Galleries in Ottawa: The Estate Collection: Pleasure and Solace, works by Robert Lewis Griffiths.
This guide combines the knowledge gathered during my long career coordinating the Carleton University Library exhibits program and my recent sabbatical research on exhibits and events in academic libraries. Between 1983, when I was hired as Exhibits Librarian at Carleton University Library, and 2002, when the Library had little space available for exhibits and I became Head of Access Services, I was responsible for running the Library’s exhibits program. After the latest renovation to MacOdrum Library was completed in the Fall of 2013 and included dedicated space for exhibits, I was once again asked to coordinate and produce exhibits for the Library. During my 2014/2015 sabbatical I investigated the current state of exhibits and events in academic libraries through literature and Web searches and site visits to a number of universities. The end result is this guide, which I hope is both practical and inspirational.
“The first volume of Documents on Canadian External Relations was published in 1967, as Canada celebrated its first century of nationhood. Since then, volumes in this series have dealt with various periods in the history of Canadian foreign policy, from the Laurier era up to the Pearson years in government. The series currently includes 29 regular volumes and has reprinted over 20,000 documents, totalling almost 40,000 pages of text, making it the largest historical documentary project in Canada. The subject of this special volume, the Arctic, has an ever-growing importance for Canada as we approach our federation's 150th anniversary. This volume illuminates how and why Canada asserted its sovereignty over the Far North between 1874 and 1949, and it demonstrates how much Canadians today owe to the nation builders of the past”--Preface, p. [vi].
This paper discusses reverse engineering source code to produce UML sequence diagrams, with the aim of aiding program comprehension and other software life cycle activities (e.g., verification). As a first step we produce scenario diagrams using the UML sequence diagram notation. We build on previous work, now combining static and dynamic analyses of Java software, our objective being to obtain a lightweight instrumentation and therefore disturb the software behaviour as little as possible. We extract the control flow graph from the software source code and obtain an execution trace by instrumenting and running the software. Control flow and trace information is represented as models, and UML scenario diagram generation becomes a model transformation problem. Our validation shows that we indeed reduce the execution overhead inherent to dynamic analysis, without losing in terms of the quality of the reverse-engineered information, and therefore in terms of the usefulness of the approach (e.g., for program comprehension).
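The sketch below gives a flavour of what lightweight run-time tracing can look like; it is an illustrative stand-in, not the instrumentation used in the paper, and the event format is hypothetical.

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

/**
 * Minimal sketch of run-time tracing: instrumented methods report entry/exit
 * events which, together with a statically extracted control flow graph, can
 * later be transformed into scenario (sequence) diagrams.
 */
public final class Tracer {

    private static PrintWriter out;

    public static synchronized void open(String traceFile) throws IOException {
        out = new PrintWriter(new FileWriter(traceFile), true);
    }

    public static synchronized void enter(String caller, String callee, String method) {
        out.printf("%d;ENTER;%s;%s;%s%n", Thread.currentThread().getId(), caller, callee, method);
    }

    public static synchronized void exit(String callee, String method) {
        out.printf("%d;EXIT;%s;%s%n", Thread.currentThread().getId(), callee, method);
    }
}

// Usage inside an instrumented method (calls inserted by the instrumentation step):
//   Tracer.enter("Client", "Account", "withdraw(double)");
//   ... original method body ...
//   Tracer.exit("Account", "withdraw(double)");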
Every year, for over three decades, Carleton University in Ottawa, Ontario has participated with other local educational institutions in providing a week-long instruction program that introduces young students to higher education. Highly motivated participants in grades 8–11, numbering over 3,000, attend from several school boards in both eastern Ontario and western Quebec. The Enriched Mini Course Program has become an important recruitment tool for each institution, and at Carleton University, over 50 enriched mini courses are offered, including one recent addition by the MacOdrum Library staff.
In this article, the author recounts how leading an enriched mini course for millennials in the university library's new Discovery Centre is an innovative initiative that demonstrates the significance of the academic library in the local community, and how staff collaboration helps to develop team building and positive vibes with the millennials.
Design by Contract (DbC) is a software development methodology that focuses on clearly defining the interfaces between components to produce better quality object-oriented software. The idea behind DbC is that a method defines a contract stating the requirements a client needs to fulfill to use it, the precondition, and the properties it ensures after its execution, the postcondition. Though there exists ample support for DbC for sequential programs, applying DbC to concurrent programs presents several challenges. Using Java as the target programming language, this paper tackles such challenges by augmenting the Java Modelling Language (JML) and modifying the JML compiler to generate Runtime Assertion Checking (RAC) code to support DbC in concurrent programs. We applied our solution in a carefully designed case study on a highly concurrent industrial software system from the telecommunications domain to assess the effectiveness of contracts as test oracles in detecting and diagnosing functional faults in concurrent software. Based on these results, clear and objective requirements are defined for contracts to be effective test oracles for concurrent programs whilst balancing the effort to design them. Main results include that contracts of a realistic level of completeness and complexity can detect around 76% of faults and reduce the diagnosis effort for such faults by at least ten times. We, therefore, show that DbC can not only be applied to concurrent software but can also be a valuable tool to improve the economics of software engineering.
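The following small example illustrates what JML contracts look like on a simple sequential class; it is not code from the industrial case study, and the class itself is hypothetical.

/**
 * Illustrative Design by Contract example using JML annotation comments.
 * The precondition states what the caller must guarantee; the postcondition
 * states what the method guarantees on return. A runtime assertion checker
 * (RAC) can turn these clauses into executable checks that act as test oracles.
 */
public class BoundedBuffer {

    private /*@ spec_public @*/ final int[] items;
    private /*@ spec_public @*/ int size;

    //@ requires capacity > 0;
    //@ ensures items.length == capacity && size == 0;
    public BoundedBuffer(int capacity) {
        this.items = new int[capacity];
        this.size = 0;
    }

    //@ requires size < items.length;          // precondition: buffer not full
    //@ ensures size == \old(size) + 1;        // postcondition: one more element
    //@ ensures items[size - 1] == value;      //   and it is the value just added
    public void put(int value) {
        items[size++] = value;
    }

    //@ ensures \result == size;
    public /*@ pure @*/ int count() {
        return size;
    }
}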
In this paper we propose a method and a tool to generate test suites from extended finite state machines, accounting for multiple (potentially conflicting) objectives. We aim at maximizing coverage and feasibility of a test suite while minimizing similarity between its test cases and minimizing overall cost. Therefore, we define a multi-objective genetic algorithm that searches for optimal test suites based on four objective functions. In doing so, we create an entire test suite at once as opposed to test cases one at a time. Our approach is evaluated on two different case studies, showing interesting initial results.
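One conceivable encoding for such whole-suite search is sketched below: a candidate is an entire test suite whose test cases are sequences of EFSM transition identifiers, and mutation perturbs the suite as a whole. The representation and names are illustrative assumptions, not the tool's actual implementation.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

/**
 * Illustrative encoding for whole-test-suite search: a candidate is a list of
 * test cases, each a sequence of (hypothetical) EFSM transition identifiers,
 * so the search evolves entire suites rather than individual test cases.
 */
public class TestSuiteChromosomeSketch {

    static final Random RNG = new Random(1);

    /** A test case is a sequence of transition ids through the EFSM. */
    record TestCase(List<Integer> transitions) {}

    /** A candidate solution for the genetic algorithm: a whole test suite. */
    record TestSuite(List<TestCase> testCases) {}

    static TestSuite mutate(TestSuite suite, int transitionCount) {
        List<TestCase> copy = new ArrayList<>(suite.testCases());
        int pick = RNG.nextInt(copy.size());
        List<Integer> path = new ArrayList<>(copy.get(pick).transitions());
        // Replace one step of one test case with a random transition id.
        path.set(RNG.nextInt(path.size()), RNG.nextInt(transitionCount));
        copy.set(pick, new TestCase(path));
        return new TestSuite(copy);
    }

    public static void main(String[] args) {
        TestSuite seed = new TestSuite(List.of(
            new TestCase(List.of(0, 3, 5)),
            new TestCase(List.of(1, 2, 4, 6))));
        TestSuite mutant = mutate(seed, 8);
        System.out.println("Mutated suite: " + mutant);
    }
}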
For functional testing based on the input domain of a functionality, parameters and their values are identified and a test suite is generated using a criterion exercising combinations of those parameters and values. Since software systems are large, resulting in large numbers of parameters and values, a technique based on combinatorics called Combinatorial Testing (CT) is used to automate the process of creating those combinations. CT is typically performed with the help of combinatorial objects called Covering Arrays. The goal of the present work is to determine available algorithms/tools for generating a combinatorial test suite. We tried to be as complete as possible by using a precise protocol for selecting papers describing those algorithms/tools. The 75 algorithms/tools we identified are then categorized on the basis of different comparison criteria, including: the test suite generation technique, the support for selection (combination) criteria, mixed covering array, the strength of coverage, and the support for constraints between parameters. Results can be of interest to researchers or software companies who are looking for a CT algorithm/tool suitable for their needs.
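The sketch below illustrates the coverage notion underlying covering arrays at strength t = 2 (pairwise): it enumerates every required parameter-value pair and counts how many a candidate test suite covers. Parameters, values, and tests are invented; real CT tools construct near-minimal covering arrays rather than merely measuring coverage.

import java.util.HashSet;
import java.util.Set;

/**
 * Small illustration of pairwise (strength t = 2) coverage: enumerate every
 * parameter-value pair combination and check how many a test suite covers.
 */
public class PairwiseCoverageSketch {

    public static void main(String[] args) {
        // Three hypothetical parameters with their values.
        String[][] parameters = {
            {"Windows", "Linux"},          // OS
            {"Chrome", "Firefox", "Edge"}, // Browser
            {"IPv4", "IPv6"}               // Protocol
        };

        // A candidate test suite: one row = one value index per parameter.
        int[][] tests = { {0, 0, 0}, {1, 1, 1}, {0, 2, 1}, {1, 0, 1}, {0, 1, 0}, {1, 2, 0} };

        Set<String> required = new HashSet<>();
        for (int p = 0; p < parameters.length; p++)
            for (int q = p + 1; q < parameters.length; q++)
                for (int i = 0; i < parameters[p].length; i++)
                    for (int j = 0; j < parameters[q].length; j++)
                        required.add(p + "=" + i + "," + q + "=" + j);

        Set<String> covered = new HashSet<>();
        for (int[] t : tests)
            for (int p = 0; p < t.length; p++)
                for (int q = p + 1; q < t.length; q++)
                    covered.add(p + "=" + t[p] + "," + q + "=" + t[q]);

        System.out.printf("Covered %d of %d required pairs%n", covered.size(), required.size());
    }
}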
UML diagrams describe different views of one piece of software. These diagrams strongly depend on each other and must therefore be consistent with one another, since inconsistencies between diagrams may be a source of faults during software development activities that rely on these diagrams. It is therefore paramount that consistency rules be defined and that inconsistencies be detected, analyzed and fixed. The relevant literature shows that authors typically define their own UML consistency rules, sometimes defining the same rules and sometimes defining rules that are already in the UML standard. The reason might be that no consolidated set of rules that are deemed relevant by authors can be found to date. The aim of our research is to provide a consolidated set of UML consistency rules and obtain a detailed overview of the current research in this area. We therefore followed a systematic procedure in order to collect and analyze UML consistency rules. We then consolidated a set of 116 UML consistency rules (avoiding redundant definitions or definitions already in the UML standard) that can be used as an important reference for UML-based software development activities, for teaching UML-based software development, and for further research.
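For illustration, the toy check below encodes one commonly cited rule of this kind (paraphrased, not quoted from the consolidated set): every message in a sequence diagram must correspond to an operation of the receiver's class. The model representation is a deliberately simplified stand-in for a real UML metamodel.

import java.util.List;
import java.util.Set;

/**
 * Toy illustration of one UML consistency rule: each sequence-diagram message
 * must name an operation that exists on the receiver's class in the class diagram.
 */
public class MessageOperationRuleSketch {

    record Message(String receiverClass, String operation) {}
    record ClassDecl(String name, Set<String> operations) {}

    static boolean consistent(List<Message> sequenceMessages, List<ClassDecl> classes) {
        return sequenceMessages.stream().allMatch(m ->
            classes.stream().anyMatch(c ->
                c.name().equals(m.receiverClass()) && c.operations().contains(m.operation())));
    }

    public static void main(String[] args) {
        List<ClassDecl> classDiagram = List.of(
            new ClassDecl("Account", Set.of("deposit", "withdraw")));
        List<Message> sequenceDiagram = List.of(
            new Message("Account", "withdraw"),
            new Message("Account", "close"));   // no such operation -> inconsistency

        System.out.println("Diagrams consistent: " + consistent(sequenceDiagram, classDiagram));
    }
}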
Reverse-engineering object interactions from source code can be done through static, dynamic, or hybrid (static plus dynamic) analyses. In the latter two, monitoring a program and collecting runtime information translates into some overhead during program execution. Depending on the type of application, the imposed overhead can reduce the precision and accuracy of the reverse-engineered object interactions (the larger the overhead, the less precise or accurate the reverse-engineered interactions), to such an extent that the reverse-engineered interactions may not be correct, especially when reverse-engineering a multithreaded software system. One is therefore seeking an instrumentation strategy that is as unintrusive as possible. In our past work, we showed that a hybrid approach is one step towards such a solution, compared to a purely dynamic approach, and that there is room for improvement. In this paper, we uncover, in a systematic way, other aspects of the dynamic analysis that can be improved to further reduce runtime overhead, and study alternative solutions. Our experiments show effective overhead reduction thanks to a modified procedure to collect runtime information.
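One generic way to lower collection overhead, sketched below purely for illustration and not necessarily the modified procedure evaluated in the paper, is to buffer trace events per thread and flush them in bulk, so the instrumented code avoids per-event synchronization and I/O.

import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch of buffered, per-thread trace collection: each thread
 * accumulates its events locally and flushes them in bulk.
 */
public final class BufferedThreadTrace {

    private static final int FLUSH_THRESHOLD = 10_000;

    private static final ThreadLocal<List<String>> BUFFER =
            ThreadLocal.withInitial(ArrayList::new);

    public static void record(String event) {
        List<String> buf = BUFFER.get();
        buf.add(Thread.currentThread().getId() + ";" + event);
        if (buf.size() >= FLUSH_THRESHOLD) {
            flush(buf);
        }
    }

    public static void flushCurrentThread() {
        flush(BUFFER.get());
    }

    private static synchronized void flush(List<String> buf) {
        // In a real setting the buffer would be appended to a trace file;
        // here we just report and drop it to keep the sketch self-contained.
        System.out.println("flushing " + buf.size() + " events");
        buf.clear();
    }
}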
The practitioner interested in reducing software verification effort may find herself lost in the many alternative definitions of Graphical User Interface (GUI) testing that exist and their relation to the notion of system testing. One result of these many definitions is that one may end up testing the same parts of the Software Under Test (SUT) twice, specifically the application logic code. To clarify these two important testing activities and help avoid duplicate testing effort, this paper studies possible differences between GUI testing and system testing experimentally. Specifically, we selected a SUT equipped with system tests that directly exercise the application code; we used GUITAR, a well-known GUI testing tool, to GUI test this SUT. Experimental results show important differences between system testing and GUI testing in terms of structural coverage and test cost.
Context: The Unified Modeling Language (UML), with its 14 different diagram types, is the de-facto standard tool for object-oriented modeling and documentation. Since the various UML diagrams describe different aspects of one, and only one, software under development, they are not independent but strongly depend on each other in many ways. In other words, the UML diagrams describing a software must be consistent. Inconsistencies between these diagrams may be a source of a considerable increase in faults in software systems. It is therefore paramount that these inconsistencies be detected, analyzed and hopefully fixed.
Objective: The aim of this article is to deliver a comprehensive summary of UML consistency rules as they are described in the literature to date, to obtain an extensive and detailed overview of the current research in this area.
Method: We performed a Systematic Mapping Study by following well-known guidelines. We selected 94 primary studies from a search with seven search engines performed in December 2012.
Results: Different results are worth mentioning. First, it appears that researchers tend to discuss very similar consistency rules over and over again. Most rules are horizontal (98.07%) and syntactic (88.03%). The most used diagrams are the class diagram (71.28%), the sequence diagram (47.87%) and the state machine diagram (42.55%).
Conclusion: The fact that many rules are duplicated in primary studies confirms the need for a well-accepted list of consistency rules. This paper is a first step in this direction. Results indicate that much more work is needed to develop consistency rules for all 14 UML diagrams, in all dimensions of consistency (e.g., semantic and syntactic on the one hand, horizontal, vertical and evolution on the other hand).
Model-based testing (MBT) is about testing a software system by using a model of its behaviour. To benefit fully from MBT, automation support is required. This paper presents a systematic review of prominent MBT tool support where we focus on tools that rely on state-based models. The systematic review protocol precisely describes the scope of the search and the steps involved in tool selection. Precisely defined criteria are used to compare selected tools and comprise support for test coverage criteria, level of automation for various testing activities, and support for the construction of test scaffolding. The results of this review should be of interest to a wide range of stakeholders: software companies interested in selecting the most appropriate MBT tool for their needs; organizations willing to invest into creating MBT tool support; researchers interested in setting research directions.
There have been a number of steganography embedding techniques proposed over the past few years. In turn, there has been great interest in steganalysis techniques as the embedding techniques improve. Specifically, universal steganalysis techniques have become more attractive since they work independently of the embedding technique. In this work, we examine the effectiveness of a basic universal technique that relies on some knowledge about the cover media, but not the embedding technique. We consider images as a cover media, and examine how a single technique that we call steganographic sanitization performs on 26 different steganography programs that are publicly available on the Internet. Our experiments are completed using a number of secret messages and a variety of different levels of sanitization. However, since our intent is to remove covert communication, and not authentication information, we examine how well the sanitization process preserves authentication information such as watermarks and digital fingerprints.
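As a minimal illustration of the sanitization idea (not the exact procedure evaluated in the experiments), the sketch below randomizes the least-significant bit of every colour channel, which destroys fragile LSB-embedded payloads while leaving the visible image, and ideally robust watermarks, essentially intact; the file names are hypothetical.

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Random;
import javax.imageio.ImageIO;

/**
 * Illustrative sanitization sketch: randomize the least-significant bit of
 * each colour channel to disturb fragile LSB-embedded payloads.
 */
public class LsbSanitizerSketch {

    public static void main(String[] args) throws IOException {
        // Hypothetical file names for the demonstration.
        BufferedImage img = ImageIO.read(new File("cover.png"));
        Random rng = new Random();

        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                int rgb = img.getRGB(x, y);
                int mask = (rng.nextInt(2))            // blue LSB
                         | (rng.nextInt(2) << 8)       // green LSB
                         | (rng.nextInt(2) << 16);     // red LSB
                // Clear the three colour LSBs, then set them to random bits.
                img.setRGB(x, y, (rgb & ~0x010101) | mask);
            }
        }
        ImageIO.write(img, "png", new File("sanitized.png"));
    }
}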
Context: The Unified Modeling Language (UML), with its 14 different diagram types, is the de-facto standard modeling language for object-oriented modeling and documentation. Since the various UML diagrams describe different aspects of one, and only one, software under development, they are not independent but strongly depend on each other in many ways. In other words, diagrams must remain consistent. Dependencies between diagrams can become so intricate that it is sometimes even possible to synthesize one diagram on the basis of others. Support for synthesizing one UML diagram from other diagrams can provide the designer with significant help, thus speeding up the design process, decreasing the risk of errors, and guaranteeing consistency among the diagrams.
Objective: The aim of this article is to provide a comprehensive summary of UML synthesis techniques as they have been described in the literature to date in order to obtain an extensive and detailed overview of the current research in this area.
Method: We have performed a Systematic Mapping Study by following well-known guidelines. We selected ten primary studies by means of a search with seven search engines performed on October 2, 2013.
Results: Various results are worth mentioning. First, it appears that researchers have not frequently published papers concerning UML synthesis techniques since 2004 (with the exception of two papers published in 2010). Only half of the UML diagram types are involved in the synthesis techniques we discovered. The UML diagram type most frequently used as the source for synthesizing another diagram is the sequence diagram (66.7%), and the most synthesized diagrams are the state machine diagram (58.3%) and the class diagram (25%).
Conclusion: The fact that we did not obtain a large number of primary studies over a 14-year period (only ten papers) indicates that synthesizing a UML diagram from other UML diagrams is not a particularly active line of research. Research on UML diagram synthesis is nevertheless relevant since synthesis techniques rely on or enforce diagram consistency, and studying UML diagram consistency is an active line of research. Another result is that research is needed to investigate synthesis techniques for other types of UML diagrams than those involved in our primary studies.
In this work we discuss our efforts to use the ubiquity of smart phone systems and the mobility they provide to stream historical information about the user’s current place on the earth. We propose the concept of timescapes to portray the historical significance of where the user is standing and to allow a brief travel through time. By combining GPS location with a rich media interpretation of existing historical documents, historical facts become an on-demand resource available to travellers, school children, historians and any interested third party. To our knowledge this is the first introduction of the term timescape in the context of historical information pull.
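A minimal sketch of the location-based lookup such a timescape service implies is given below; the record structure, coordinates, and radius are illustrative assumptions rather than the system actually built.

import java.util.List;

/**
 * Illustrative sketch of a timescape lookup: given the phone's GPS fix,
 * return the historical entries within a small radius.
 */
public class TimescapeLookupSketch {

    record HistoricalEntry(String title, int year, double lat, double lon) {}

    // Great-circle distance in metres (haversine formula).
    static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
        double r = 6_371_000;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.pow(Math.sin(dLat / 2), 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * r * Math.asin(Math.sqrt(a));
    }

    static List<HistoricalEntry> nearby(List<HistoricalEntry> all,
                                        double lat, double lon, double radiusMetres) {
        return all.stream()
                  .filter(e -> distanceMetres(lat, lon, e.lat(), e.lon()) <= radiusMetres)
                  .toList();
    }

    public static void main(String[] args) {
        List<HistoricalEntry> archive = List.of(
            new HistoricalEntry("Rideau Canal opens", 1832, 45.4250, -75.6950),
            new HistoricalEntry("Parliament Hill fire", 1916, 45.4236, -75.7009));
        // A hypothetical GPS fix in downtown Ottawa.
        nearby(archive, 45.4248, -75.6960, 500).forEach(System.out::println);
    }
}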
We test for the presence of time-varying parameters (TVP) in the long-run dynamics of energy prices for oil, natural gas and coal, within a standard class of mean-reverting models. We also propose residual-based diagnostic tests and examine out-of-sample forecasts. In-sample LR tests support the TVP model for coal and gas but not for oil, though companion diagnostics suggest that the model is too restrictive to conclusively fit the data. Out-of-sample analysis suggests a random-walk specification for oil price, and TVP models for both real-time forecasting in the case of gas and long-run forecasting in the case of coal.
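For readers unfamiliar with the model class, one standard way to write a mean-reverting specification with time-varying parameters is sketched below; the paper's exact specification may differ.

\[ p_t = \mu_t + \phi_t\,(p_{t-1} - \mu_{t-1}) + \varepsilon_t, \qquad \varepsilon_t \sim N(0,\sigma_\varepsilon^2), \]
\[ \mu_t = \mu_{t-1} + \eta_t, \qquad \phi_t = \phi_{t-1} + \nu_t. \]

Here \(p_t\) is the (log) energy price, \(\mu_t\) the long-run level towards which the price reverts, and \(\phi_t\) the speed-of-reversion parameter. Setting the innovation variances of \(\eta_t\) and \(\nu_t\) to zero recovers the constant-parameter mean-reverting model, while \(\phi_t = 1\) with a fixed level collapses to a random walk, which is the specification favoured out-of-sample for oil.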
We address the problem of discovering routes in strongly connected planar geometric networks with directed links. Motivated by the necessity for establishing communication in wireless ad hoc networks in which the only information available to a vertex is its immediate neighborhood, we are considering routing algorithms that use the neighborhood information of a vertex for routing with constant memory only. We solve the problem for three types of directed planar geometric networks: Eulerian (in which every vertex has the same number of incoming and outgoing edges), Outerplanar (in which a single face contains all vertices of the network), and Strongly Face Connected, a new class of geometric networks that we define in the article, consisting of several faces, each face being a strongly connected outerplanar graph.
This article draws on Margaret Radin's theorization of 'contested commodities' to explore the process whereby informal housing becomes formalized while also being shaped by legal regulation. In seeking to move once-informal housing into the domain of official legality, cities can seldom rely on a simple legal framework of private-law principles of property and contract. Instead, they face complex trade-offs between providing basic needs and affordability and meeting public-law norms around living standards, traditional neighbourhood feel and the environment. This article highlights these issues through an examination of the uneven process of legal formalization of basement apartments in Vancouver, Canada. We chose a lengthy period-from 1928 to 2009-to explore how basement apartments became a vital source of housing often at odds with city planning that has long favoured a low-density residential built form. We suggest that Radin's theoretical account makes it possible to link legalization and official market construction with two questions: whether to permit commodification and how to permit commodification. Real-world commodification processes-including legal sanction-reflect hybridization, pragmatic decision making and regulatory compromise. The resolution of questions concerning how to legalize commodification are also intertwined with processes of market expansion.
The new renewable fuels standard (RFS 2) aims to distinguish corn-ethanol that achieves a 20% reduction in greenhouse gas (GHG) emissions compared with gasoline. Field data from Kim et al. (2009) and from our own study suggest that geographic variability in the GHG emissions arising from corn production casts considerable doubt on the approach used in the RFS 2 to measure compliance with the 20% target. If regulators wish to require compliance of fuels with specific GHG emission reduction thresholds, then data from growing biomass should be disaggregated to a level that captures the level of variability in grain corn production and the application of life cycle assessment to biofuels should be modified to capture this variability.
One hundred and ten English-speaking children schooled in French were followed from kindergarten to Grade 2 (Mage: T1 = 5;6, T2 = 6;4, T3 = 6;11, T4 = 7;11). The findings provided strong support for the Home Literacy Model (Sénéchal & LeFevre, 2002) because in this sample the home language was independent of the language of instruction. The informal literacy environment at home predicted growth in English receptive vocabulary from kindergarten to Grade 1, whereas parent reports of the formal literacy environment in kindergarten predicted growth in children's English early literacy between kindergarten and Grade 1 and growth in English word reading during Grade 1. Furthermore, 76% of parents adjusted their formal literacy practices according to the reading performance of their child, in support of the presence of a responsive home literacy curriculum among middle-class parents.
In this special issue of Nova Religio four historians of medieval and early modern Christianities offer perspectives on basic conceptual frameworks widely employed in new religions studies, including modernization and secularization, radicalism/violent radicalization, and diversity/diversification. Together with a response essay by J. Gordon Melton, these articles suggest strong possibilities for renewed and ongoing conversation between scholars of "old" and "new" religions. Unlike some early discussions, ours is not aimed simply at questioning the distinction between old and new religions itself. Rather, we think such conversation between scholarly fields holds the prospect of productive scholarly surprise and perspectival shifts, especially via the disciplinary practice of historiographical criticism.
We show that the tilted-grating-assisted excitation of surface plasmon polaritons on gold-coated single-mode optical fibers depends strongly on the state of polarization of the core-guided light, even in fibers with cylindrical symmetry. Rotating the linear polarization of the guided light by 90° relative to the grating tilt plane is sufficient to turn the plasmon resonances on and off with more than 17 dB of extinction ratio. By monitoring the amplitude changes of selected individual cladding mode resonances we identify what we believe to be a new refractive index measurement method that is shown to be accurate to better than 5 × 10⁻⁵.
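For reference, the extinction ratio quoted above is the standard decibel power ratio,

\[ \mathrm{ER}_{\mathrm{dB}} = 10\log_{10}\!\left(\frac{P_{\mathrm{on}}}{P_{\mathrm{off}}}\right), \]

where \(P_{\mathrm{on}}\) and \(P_{\mathrm{off}}\) denote the transmitted power at the resonance wavelength for the two orthogonal polarization states; 17 dB corresponds to a power ratio of roughly 50.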
The goal of the present intervention research was to test whether guided invented spelling would facilitate entry into reading for at-risk kindergarten children. The 56 participating children had poor phoneme awareness, and as such, were at risk of having difficulty acquiring reading skills. Children were randomly assigned to one of three training conditions: invented spelling, phoneme segmentation, or storybook reading. All children participated in 16 small group sessions over eight weeks. In addition, children in the three training conditions received letter-knowledge training and worked on the same 40 stimulus words that were created from an array of 14 letters. The findings were clear: on pretest, there were no differences between the three conditions on measures of early literacy and vocabulary, but, after training, invented-spelling children learned to read more words than did the other children. As expected, the phoneme-segmentation and invented-spelling children were better on phoneme awareness than were the storybook-reading children. Most interesting, however, both the invented-spelling and the phoneme-segmentation children performed similarly on phoneme awareness, suggesting that the differential effect on learning to read was not due to phoneme awareness per se. As such, the findings support the view that invented spelling is an exploratory process that involves the integration of phoneme and orthographic representations. With guidance and developmentally appropriate feedback, invented spelling provides a milieu for children to explore the relation between oral language and written symbols that can facilitate their entry into reading.