We report on the fabrication of a chirped phase mask that was used to create a fiber Bragg grating (FBG) device for the compensation of chromatic dispersion in long-haul optical transmission networks. Electron beam lithography was used to expose the grating onto a resist-coated quartz plate. After etching, this phase mask was used to holographically expose an index grating into the fiber core [K. O. Hill, F. Bilodeau, D. C. Johnson, and J. Albert, Appl. Phys. Lett. 62, 1035 (1993)]. The linear increase in the grating period, the "chirp," is only 0.55 nm over the 10 cm grating. This is too small to be defined by computer-aided design and a digital deflection system. Instead, the chirp was incorporated by repeatedly rescaling the analog electronics used for field size calibration. Special attention must be paid to minimizing field stitching and exposure artifacts. This was done by using overlapping fields in a "voting" method: each grating line is exposed by the accumulation of three overlapping exposures at 1/3 dose. This translates any abrupt stitching error into a small but uniform change in the line-to-space ratio of the grating. The phase mask was used with the double-exposure photoprinting technique [K. O. Hill, F. Bilodeau, B. Malo, T. Kitagawa, S. Thériault, D. C. Johnson, J. Albert, and K. Takiguchi, Opt. Lett. 19, 1314 (1994)]: a KrF excimer laser holographically imprints an apodized chirped Bragg grating in a hydrogen-loaded SMF-28 optical fiber. Our experiments have demonstrated a spectral delay of −1311 ps/nm with a linearity of ±10 ps over the 3 dB bandwidth of the resonant wavelength of the FBG. The reflectance, centered on 1550 nm, shows a side-lobe suppression of −25 dB. Fabrication processes and optical characterization will be discussed.
Part of a series from the CMCRP. Visit the CMCRP website for project details and background: http://www.cmcrp.org
Every year the Canadian Media Concentration Research Project puts out two reports on the state of the telecoms, internet, and media industries in Canada. This is the second installment in this year’s series. Whereas the first report in this series examines the growth, development and upheaval that are transforming the media industries in Canada, this report goes a step further by asking a deceptively simple but profoundly important question: have these industries—individually and collectively—become more or less concentrated over time? The report does so by examining the state of competition and concentration in the mobile wireless and wireline telecoms market, broadband internet access, cable, satellite & IPTV services, broadcast television and radio, specialty and pay television services, online video subscription and download services, newspapers, magazines, internet advertising, search engines, social media, and mobile and desktop operating systems and browsers. This year’s report also adds significantly to our efforts last year to examine the dynamics of advertising spending across all media in Canada, i.e. TV, radio, online, newspapers, magazines and out-of-doors. As we noted in our first report, we have also significantly expanded our coverage by taking some preliminary steps to capture a broader range of audiovisual media services that are delivered over the internet.
Rural and remote communities comprise around 32% and 22% of Australia’s and Canada’s populations, respectively. However, only 14% and 16% of family physicians in Australia and Canada, respectively, practice in these communities, resulting in disproportionately poorer access compared with urban areas. An erosion of health services occurs when the number of physicians and other health care providers in a region is insufficient, or when these professionals are absent altogether. Even where providers do exist in a rural and remote region, they are often overburdened. Inaccessibility of services in rural and remote communities results in poor health outcomes for all involved.
In Canada, 1 in 7 physicians will leave rural practice within two years. Strategies to address these turnover rates and the declining interest in entering rural practice have focused on supporting recruitment and retention (RnR) initiatives, which aim first to bring physicians into rural practice and then to encourage them to continue in rural practice beyond the short term.
These programs have so far been insufficient or ineffective in addressing the lack of physicians in rural and remote areas. A review of recent literature on RnR initiatives focused on rural physicians in Australia and Canada was conducted to investigate the strengths and limitations of these initiatives. Further, this review critically examines the short- and long-term feasibility of initiatives and develops a conceptual framework for designing or examining RnR initiatives.
UML diagrams describe different views of one piece of software. These diagrams strongly depend on each other and must therefore be consistent with one another, since inconsistencies between diagrams may be a source of faults during software development activities that rely on these diagrams. It is therefore paramount that consistency rules be defined and that inconsistencies be detected, analyzed and fixed. The relevant literature shows that authors typically define their own UML consistency rules, sometimes defining the same rules and sometimes defining rules that are already in the UML standard. A likely reason is that, to date, no consolidated set of rules deemed relevant by authors exists. The aim of our research is to provide a consolidated set of UML consistency rules and to obtain a detailed overview of the current research in this area. We therefore followed a systematic procedure to collect and analyze UML consistency rules. We then consolidated a set of 116 UML consistency rules (avoiding redundant definitions or definitions already in the UML standard) that can serve as an important reference for UML-based software development activities, for teaching UML-based software development, and for further research.
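As a hypothetical illustration (the class names, operation names, and the specific rule below are invented, and stand in for the kinds of rules such a consolidated set contains), a single consistency rule such as "every message in a sequence diagram must name an operation declared by the receiver's class in the class diagram" can be checked mechanically once both diagrams are available as data:

```python
# Hypothetical, simplified models of two UML views of the same system.
class_ops = {                       # class diagram: class -> declared operations
    "Account": {"deposit", "withdraw"},
    "Bank": {"open_account"},
}

messages = [                        # sequence diagram: (receiver class, operation)
    ("Account", "deposit"),
    ("Account", "close"),           # inconsistent: 'close' is not declared
]

# Flag every message whose operation is not declared by the receiver's class.
inconsistencies = [
    (cls, op) for cls, op in messages
    if op not in class_ops.get(cls, set())
]
```

This kind of rule is purely syntactic; many rules in the literature also involve behavioral consistency, which requires richer models than the dictionaries sketched here.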
This article draws on Margaret Radin's theorization of 'contested commodities' to explore the process whereby informal housing becomes formalized while also being shaped by legal regulation. In seeking to move once-informal housing into the domain of official legality, cities can seldom rely on a simple legal framework of private-law principles of property and contract. Instead, they face complex trade-offs between providing basic needs and affordability and meeting public-law norms around living standards, traditional neighbourhood feel and the environment. This article highlights these issues through an examination of the uneven process of legal formalization of basement apartments in Vancouver, Canada. We chose a lengthy period, from 1928 to 2009, to explore how basement apartments became a vital source of housing often at odds with city planning that has long favoured a low-density residential built form. We suggest that Radin's theoretical account makes it possible to link legalization and official market construction with two questions: whether to permit commodification and how to permit commodification. Real-world commodification processes, including legal sanction, reflect hybridization, pragmatic decision making and regulatory compromise. The resolution of questions concerning how to legalize commodification is also intertwined with processes of market expansion.
My study attempted to determine whether the old part of our brain (the limbic system) plays a significant role in how we detect the valence of blurry words without conscious awareness of what the words are. Ten participants were shown blurry words that could not be read and were asked to guess their valence, with no time limit. The hypotheses for this study were that participants would be accurate in detecting the valence of blurred words, and that they would rate negative words the most accurately. I also predicted that participants would attempt to read the words before rating valence, and that they would do so only at the beginning. The stimuli were shown to the participants on printed paper, with 10 blurred words per page, a 5-point Likert scale beside each blurred word, and a reference scale at the top of every page. The data showed a statistically significant difference between participants' ability to detect the valence of blurred words and their ability with unblurred words (taken as 100% accuracy): participants were significantly worse at detecting the valence of blurred words than unblurred words. There was no statistically significant difference between participants' ability to detect the valence of blurry neutral words and that of blurry nonsensical words; participants were equally accurate for both word types. Participant responses also showed that they were statistically better at detecting the valence of negative blurry words than positive blurry words; that is, negative valence was detected more accurately than other valences.
The paper identifies the characteristics of firm activities that constitute its technology scanning dynamic capability, which enables the firm to translate information about customer needs into information about tangible ways to introduce new products and services to satisfy those needs. The ability to find a specific actionable way to address customer needs is proposed to be measured by a latent construct called technology scanning. Drawing on the literature on marketing, innovation management, knowledge management, new product development, and economics, five dimensions are identified for a technology scanning scale. A strong presence of technology scanning ensures that the firm's resources are targeted at solving the problems that matter most: those identified as a consequence of a high level of market orientation of the firm. This work sheds light on how managers might solve the problems and needs of customers identified through market orientation practices. When market orientation guides technology scanning activities, the outcomes are more desirable to the firm.
More about the Centre for Studies on Poverty and Social Citizenship: https://carleton.ca/cspsc
See also: Canada's First National Housing Strategy - A Panel Discussion focusing on Canada’s first National Housing Strategy at the CASWE National Conference 2018
In 2016, with funding from the Ontario Trillium Foundation’s Seed Grant program, the Somali Centre for Family Services of Ottawa (SCFS) invited Carleton University’s Centre for Studies on Poverty and Social Citizenship (CSPSC) to partner on the completion of a needs assessment focusing on the barriers faced by Somali youth in accessing post-secondary education, and employment training and opportunities. In carrying out this research, the SCFS’s main objective was to address social and economic exclusion locally by inviting Somali youth (ages 19-30) from the Ottawa area to engage in the conceptualization and design of resources that could best support their participation in educational and employment opportunities.
We consider a problem that can greatly enhance the areas of cursive script recognition and the recognition of printed character sequences: recognizing words/strings by processing their noisy subsequences. Let X* be any unknown word from a finite dictionary H, and let U be any arbitrary subsequence of X*. We study the problem of estimating X* by processing Y, a noisy version of U. Y contains substitution, insertion, deletion and generalized transposition errors, the latter occurring when transposed characters are themselves subsequently substituted. We solve the noisy subsequence recognition problem by defining and using the constrained edit distance between X ∈ H and Y, subject to any arbitrary edit constraint involving the number and type of edit operations to be performed. An algorithm to compute this constrained edit distance is presented. Using this algorithm, we present a syntactic Pattern Recognition (PR) scheme that corrects noisy text containing all these types of errors. Experimental results involving strings of lengths between 40 and 80, with an average of 30.24 deleted characters and an overall average noise of 68.69%, demonstrate the superiority of our system over existing methods.
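To make the flavor of the distance computation concrete, here is a minimal sketch of the unconstrained case: the optimal-string-alignment variant of edit distance, with unit-cost substitutions, insertions, deletions and adjacent transpositions. The paper's actual algorithm additionally handles generalized transpositions (transposed characters subsequently substituted) and arbitrary constraints on the number and type of operations, which this sketch omits; the `recognize` helper is likewise a simplification of the dictionary-based scheme.

```python
def edit_distance(x, y):
    """Optimal-string-alignment edit distance: substitutions,
    insertions, deletions and adjacent transpositions, each cost 1."""
    m, n = len(x), len(y)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                               # delete all of x[:i]
    for j in range(n + 1):
        d[0][j] = j                               # insert all of y[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if x[i - 1] == y[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
            if i > 1 and j > 1 and x[i - 1] == y[j - 2] and x[i - 2] == y[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

def recognize(y, dictionary):
    """Nearest-neighbor decoding: return the dictionary word closest to
    the noisy string y under the (unconstrained) edit distance."""
    return min(dictionary, key=lambda x: edit_distance(x, y))
```

The constrained variant in the paper replaces the single table above with one indexed also by the number of edit operations of each type used so far, so that paths violating the constraint can be excluded.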
This paper deals with the relatively new field of sequence-based estimation, which utilizes both the information in the observations and the information in their sequence of appearance. Our intention is to obtain Maximum Likelihood estimates by “extracting” the information contained in the observations when they are perceived as a sequence rather than as a set. Earlier work introduced the concepts of Sequence Based Estimation (SBE) for the binomial distribution. The current paper generalizes these results to the multinomial “two-at-a-time” scenario. We invoke a novel phenomenon called “Occlusion”, which can be described as follows: by “concealing” certain observations, we map the estimation problem onto a lower-dimensional binomial space. Once these occluded SBEs have been computed, we demonstrate how the overall Multinomial SBE (MSBE) can be obtained by mapping several lower-dimensional estimates back onto the original higher-dimensional space. We formally prove and experimentally demonstrate the convergence of the corresponding estimates.
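A rough sketch of the occlusion idea, under a deliberate simplification: the pairwise binomial estimates below are plain frequency-count MLEs rather than the paper's sequence-based estimators (which would also exploit the order of the observations). What the sketch does show is the two-step structure: conceal all but two categories to get a binomial subproblem, then map the pairwise estimates back to the full multinomial space.

```python
def occluded_binomial_mle(seq, i, j):
    """Conceal every observation except categories i and j, reducing the
    problem to a binomial one; estimate p_i / (p_i + p_j) by frequency.
    (The paper would apply a sequence-based estimator here instead.)"""
    sub = [s for s in seq if s in (i, j)]
    return sub.count(i) / len(sub)

def multinomial_from_pairs(seq, k):
    """Recover a k-category multinomial estimate from the occluded
    pairwise estimates, anchoring every category against category 0."""
    ratios = [1.0]                               # p_0 / p_0
    for i in range(1, k):
        r = occluded_binomial_mle(seq, i, 0)     # p_i / (p_i + p_0)
        ratios.append(r / (1 - r))               # p_i / p_0
    total = sum(ratios)
    return [r / total for r in ratios]           # normalize to sum to 1
```

For example, a sequence with 50 zeros, 30 ones and 20 twos recovers the proportions (0.5, 0.3, 0.2) exactly, since the pairwise frequency ratios are consistent with the overall counts.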