
Abstract

In this work, we propose a spatio-temporal Markovian-like model for ordinal observations to predict the spread of disease over time across a discrete rectangular grid of plants. The model is constructed from a logistic distribution and some simple assumptions that reflect the conditions present in a series of studies carried out to understand the dissemination of a particular infection in plants. After constructing the model, we establish conditions for the existence and uniqueness of the maximum likelihood estimator (MLE) of the model parameters. In addition, we show that, under further restrictions based on Partially Ordered Markov Models (POMMs), the MLE is consistent and asymptotically normal. We then employ the MLE's asymptotic normality to propose tests for spatio-temporal and spatial dependence. Finally, we estimate the model from the real plant data that motivated it and use the results to construct prediction maps that clarify how the disease spreads in time and space.
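
A minimal sketch of the kind of transition rule such a model might use (the grid layout, neighbourhood, and parameter names below are illustrative assumptions, not the paper's construction): each plant's chance of moving up one ordinal disease category follows a logistic function of its own state and its neighbours' states at the previous time.

```python
import numpy as np

def logistic(x):
    """Standard logistic CDF used as the link function."""
    return 1.0 / (1.0 + np.exp(-x))

def step(grid, beta0, beta_self, beta_nbr, rng):
    """One Markov time step on a rectangular grid of ordinal states {0,1,2}.

    A plant's probability of progressing one disease category is a logistic
    function of its own state and the mean state of its 4-neighbours.
    All parameter names are illustrative, not the paper's notation.
    """
    padded = np.pad(grid, 1, mode="edge")
    nbr_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    p_progress = logistic(beta0 + beta_self * grid + beta_nbr * nbr_mean)
    progress = (rng.random(grid.shape) < p_progress) & (grid < 2)
    return grid + progress.astype(int)

rng = np.random.default_rng(0)
grid = np.zeros((20, 30), dtype=int)
grid[10, 15] = 2                      # a single infected focal plant
for _ in range(5):
    grid = step(grid, beta0=-3.0, beta_self=1.0, beta_nbr=2.5, rng=rng)
```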

Abstract

Chiral molecules hold a central position in organic and biological chemistry, so the pharmaceutical industry needs suitable strategies for drug synthesis. Moreover, Green Chemistry procedures are increasingly required in order to avoid environmental deterioration. Catalytic synthesis, in particular organocatalysis, is thus a continuously expanding field. A survey of recent research involving chiral imidazolidinones is presented here, with a particular focus on immobilized catalytic systems.

Abstract

Constant exposure to traffic noise pollution can have a significant impact on human health and well-being. Occupants of high-rise buildings along noisy traffic arteries are severely affected. In an attempt to contribute to the noise-protection design of prospective high-rise buildings, traffic noise measurements and predictions using the CRTN (Calculation of Road Traffic Noise) model were made along the façade of a high-rise building in central Athens. The aim was to test the accuracy of this model in predicting the vertical distribution (mapping) of traffic noise along such building façades under the local urban characteristics of the Mediterranean capital. The predicted and measured noise levels were found to be highly coherent with each other, and their vertical distribution pattern, by and large, confirmed findings from earlier studies. Nevertheless, the predicted values tended to underestimate the measured ones, with a mean difference of −2.2 dB(A). This underestimation appears to be associated mainly with a newly proposed feature of urban morphology, namely (local) geo-morphology. Overall, the CRTN model is a useful tool, suitable for predicting traffic noise along high-rise building façades during their planning and design stage. The results represent a further step towards more general application of the model, as well as a contribution to its use with a wider set of urban features.
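
As a rough illustration of the reported comparison (the per-floor levels below are placeholders, not the study's data), the mean prediction error and the agreement between predicted and measured façade levels can be summarized as:

```python
import numpy as np

# Hypothetical per-floor A-weighted levels in dB(A); not the study's data.
measured  = np.array([72.1, 71.4, 70.8, 69.9, 69.0, 68.2])
predicted = np.array([70.3, 69.5, 68.4, 67.6, 66.8, 65.9])

diff = predicted - measured
print(f"mean difference: {diff.mean():+.1f} dB(A)")   # negative => underestimation
print(f"correlation:     {np.corrcoef(predicted, measured)[0, 1]:.3f}")
```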

Abstract

In this study, measurements of fuel-cell current, voltage, and hydrogen flow are investigated in the spectral domain as well as in their time-domain representations, and their spectral properties are extracted. In addition, more detailed results are obtained using a simplified transfer-function approach, defined between the hydrogen flow and the cell current as an input-output pair. The spectral components of the fuel cell are then categorized according to the contributions of the process, the measurement circuits, and the digitizers. The process noise, found at very low frequencies (<15 Hz), can be explained by the various physical and chemical interactions occurring in the fuel cell. Overall, this study analyzes the spectral characteristics of fuel cells for current, voltage, and hydrogen flow in detail.
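
A minimal sketch of such an analysis (the sampling rate and signal construction are assumptions): Welch's method estimates the power spectral densities, and the cross-spectral density divided by the input PSD gives a nonparametric estimate of the hydrogen-flow-to-current transfer function.

```python
import numpy as np
from scipy import signal

fs = 100.0                                  # assumed sampling rate, Hz
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(1)
h2_flow = 1.0 + 0.05 * rng.standard_normal(t.size)   # stand-in input signal
current = (np.convolve(h2_flow, np.ones(50) / 50, "same")
           + 0.02 * rng.standard_normal(t.size))     # stand-in output signal

# Power spectral densities of the measured signals (Welch's method)
f, p_in = signal.welch(h2_flow, fs=fs, nperseg=4096)
_, p_out = signal.welch(current, fs=fs, nperseg=4096)

# Nonparametric transfer-function estimate H(f) = S_xy / S_xx
_, s_xy = signal.csd(h2_flow, current, fs=fs, nperseg=4096)
H = s_xy / p_in

low = f < 15.0   # the band the abstract attributes to process noise
print(f"share of current power below 15 Hz: {p_out[low].sum() / p_out.sum():.2f}")
```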

Abstract

This study compares metrics for environmental noise diagnosis in schools near airports. The goal is to analyze and identify the most suitable criteria for scaling aircraft noise impact on schools during landing and take-off operations. A Brazilian case study is conducted, based on noise mapping and sound level verification. The day-night average noise level (DNL) and the time above limit (TA) are investigated using acoustic simulation and noise mapping in order to identify the critical receivers. Results of DNL and TA for two schools in the airport surroundings show that the criteria adopted by the municipal and airport authorities to describe airport noise are unsatisfactory and do not reflect the intermittent behavior of this type of noise. Individual receiver analysis based on noise interruptions, captured through the TA parameter, proved more suitable for evaluating the noise impact on schools near airports.
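
For reference, the two metrics can be computed from sampled levels as sketched below, using the standard DNL definition (day 07:00-22:00, 10 dB night penalty) and TA as the time the level exceeds a chosen limit; the hourly values and the 60 dB(A) limit are illustrative assumptions.

```python
import numpy as np

def dnl(hourly_leq):
    """Day-night average sound level from 24 hourly Leq values (dB(A)).
    Hours 7-21 count as day; hours 22-6 receive a 10 dB night penalty."""
    hours = np.arange(24)
    night = (hours >= 22) | (hours < 7)
    penalized = hourly_leq + 10.0 * night
    return 10.0 * np.log10(np.mean(10.0 ** (penalized / 10.0)))

def time_above(levels, limit_db, dt_seconds):
    """Total time (s) the sampled level exceeds a limit (the TA metric)."""
    return np.count_nonzero(levels > limit_db) * dt_seconds

leq = np.full(24, 55.0)            # hypothetical hourly levels
leq[8:18] += 12.0                  # flight operations during school hours
print(f"DNL = {dnl(leq):.1f} dB(A)")
print(f"TA  = {time_above(leq, 60.0, dt_seconds=3600):.0f} s over the day")
```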

Abstract

Cyanine dyes are characterized by an odd number 2n + 3 of π-centers and 2n + 4 π-electrons (where n is the number of vinyl groups –CH=CH–). This special feature has a marked impact on their electronic structure and thus on their equilibrium structure in the electronic ground state, as well as on their color and electronic spectrum. Their first technical application was as spectral sensitizers in silver halide photography. Today they have numerous applications in digital optical data storage, computer-to-plate lithographic printing plates, bio-analysis, and medical diagnostics.

Abstract

For more than 50 years the Mean Measure of Divergence (MMD) has been one of the most prominent tools used in anthropology for the study of non-metric traits. However, one persistent problem in anthropology, and even more so in palaeoanthropology, is the lack of sufficiently large samples, or the existence of samples without sufficiently measured traits. Since 1969, with the advent of bootstrapping techniques, this issue has been tackled successfully in many different ways. Here, we present a parametric bootstrap technique based on the fact that the transformed θ, obtained from the Anscombe transformation to stabilize the variance, nearly follows a normal distribution with standard deviation σ = 1/√(N + 1/2), where N is the sample size for the measured trait. When the probabilistic distribution is known, parametric procedures offer more powerful results than non-parametric ones. We profit from knowing the probabilistic distribution of θ to develop a parametric bootstrapping method, which we explain carefully with mathematical support. We give examples with both artificial and real data. Our results show that this parametric bootstrap procedure is a powerful tool for studying samples with scarce data.
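
A minimal sketch of the bootstrap step (simplified notation; the paper's full procedure, including the MMD computed over several traits, is more elaborate): since the transformed θ̂ is approximately N(θ, 1/(N + 1/2)), replicates are drawn directly from that normal law.

```python
import numpy as np

def anscombe_theta(k, n):
    """Anscombe angular transform of a trait frequency k/n (radians)."""
    return np.arcsin(1.0 - 2.0 * (k + 3.0 / 8.0) / (n + 3.0 / 4.0))

def bootstrap_theta(k, n, n_boot=10_000, rng=None):
    """Parametric bootstrap: theta_hat is approx N(theta, 1/(n + 1/2)),
    so replicates are drawn from that normal distribution directly."""
    rng = rng or np.random.default_rng()
    theta_hat = anscombe_theta(k, n)
    sd = 1.0 / np.sqrt(n + 0.5)
    return rng.normal(theta_hat, sd, size=n_boot)

# Example: a trait observed in 7 of 23 specimens (illustrative numbers)
reps = bootstrap_theta(k=7, n=23)
print(reps.mean(), reps.std())
```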

Abstract

Missing exposure information is a very common feature of many observational studies. Here we study identifiability and efficient estimation of causal effects on vector outcomes in settings where treatment is unconfounded but partially missing. We consider a missing-at-random setting where missingness in treatment can depend not only on complex covariates, but also on post-treatment outcomes. We give a new identifying expression for average treatment effects in this setting, along with the efficient influence function for this parameter in a nonparametric model, which yields a nonparametric efficiency bound. We use this latter result to construct nonparametric estimators that are less sensitive to the curse of dimensionality than usual, e.g. by having faster rates of convergence than the complex nuisance estimators they rely on. Further, we show that these estimators can be root-n consistent and asymptotically normal under weak nonparametric conditions, even when constructed using flexible machine learning. Finally, we apply these results to the problem of causal inference with a partially missing instrumental variable.
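
A rough stand-in for the estimation idea (this simplified sketch is not the paper's efficient influence-function estimator): model the probability that treatment is observed as a function of covariates and outcome, weight the observed-treatment units accordingly, and average regression-based effect estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

def ate_missing_treatment(X, A, Y, R):
    """Simplified ATE estimate when treatment is missing at random.

    X: covariates; A: treatment (only valid where R == 1); Y: outcome;
    R: indicator that treatment was observed. Missingness may depend on
    (X, Y), so the missingness model conditions on the outcome too.
    A stand-in sketch, not the paper's efficient estimator.
    """
    XY = np.column_stack([X, Y])
    p_obs = LogisticRegression().fit(XY, R).predict_proba(XY)[:, 1]
    obs = R == 1

    # Outcome regressions among observed-treatment units, re-weighted by
    # the inverse probability that treatment was observed.
    treated = obs & (A == 1)
    control = obs & (A == 0)
    mu1 = LinearRegression().fit(X[treated], Y[treated],
                                 sample_weight=1.0 / p_obs[treated])
    mu0 = LinearRegression().fit(X[control], Y[control],
                                 sample_weight=1.0 / p_obs[control])
    return np.mean(mu1.predict(X) - mu0.predict(X))
```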

Abstract

4,4′-(4,4′-Isopropylidenediphenoxy)bis(phthalic anhydride) (BPADA) was reacted with three structurally different diamines to produce poly(amic acid)s, which were then imidized into colorless and transparent polyimide (CPI) films through stepwise thermal cyclization. The three diamines used to synthesize the BPADA-based CPIs were bis(3-aminophenyl)sulfone (APS), p-xylylenediamine (p-XDA), and bis[4-(3-aminophenoxy)phenyl]sulfone (m-BAPS). The obtained CPI films were almost colorless and exhibited excellent optical transparency. The solubility of the CPI films in various solvents was investigated, and all the films were found to be soluble in common solvents such as chloroform, dichloromethane, N,N-dimethylacetamide, and pyridine. The thermo-optical properties and oxygen transmission rates (O2TRs) of the CPI films were examined for biaxial stretching ratios in the range of 100–150% and compared. When the stretching ratio changed from 100 to 150%, the glass transition temperature and yellowness index did not change significantly; however, the O2TR decreased for all CPI films.

Abstract

X-ray Absorption Fine Structure (XAFS) spectroscopy, with its subregions X-ray Absorption Near-edge Structure (XANES) and Extended X-ray Absorption Fine Structure (EXAFS), is a powerful tool for the structural analysis of materials and is nowadays a standard component of research strategies in many fields. This review covers a wide range of topics related to its measurement and use: the origin of the fine structure; its analytical potential, derived from its physical basis; the environment for measuring XAFS at synchrotrons, including different measurement geometries, detection modes, and sample environments, e.g. for in situ and operando work; the principles of data reduction, analysis, and interpretation; and a perspective on new methods for structure analysis combining X-ray absorption with X-ray emission. Examples of the application of XAFS have been selected from work with heterogeneous catalysts, with the intention of demonstrating the strength of the method in providing structural information about highly disperse and disordered systems, illustrating pitfalls in the interpretation of results (e.g. neglecting the averaged character of the information obtained), and showing how its merits can be further enhanced by combination with other methods of structural analysis and/or spectroscopy.

Abstract

The neutron powder diffraction technique has been used for structural studies of the Rb2UBr6 solid electrolyte as a function of temperature. The low-, room-, and high-temperature structures have been determined. In the temperature range 4.2–80 K, the compound crystallizes in a monoclinic unit cell in the P21/c space group. Between 80 K and 853 K, the compound crystallizes in a tetragonal unit cell in the P4/mnc space group; at 300 K, the lattice constants are a = b = 7.745(1) Å and c = 11.064(1) Å. In the temperature range 853–960 K, a trigonal phase is observed in the P3̄m1 space group.

Abstract

Research on a set of simple fluidic alphavoltaic cells (with air under atmospheric pressure as the ionized medium) – small ionizing reactors or “nuclear batteries” designed at the Faculty of Power and Aerospace Engineering of the Warsaw University of Technology, Poland – has shown that a usable amount of electric charge can be accumulated. Two simple methods are proposed to describe the fluidic alphavoltaic cells in terms of their efficiency. The results of these methods are presented and compared with the efficiencies of other contemporary solid-body (semiconductor-junction-based) alpha- and betavoltaic cells. The comparison shows that, despite their far-reaching simplicity of design, the fluidic cells are still more efficient than some of the solid-body devices that exploit alpha decay.

Abstract

Background and objectives: This study describes the treatment planning and dose delivery methods of radiotherapy for patients undergoing bone marrow transplantation. The analysis was carried out in the context of the evolution of these methods over the last 60 years.

Materials and methods: A systematic literature search was carried out using the PubMed search engine. Overall, 90 relevant studies were included: 24 general studies, 10 describing the use of isotopes, 24 related to conventional methods, and 32 related to advanced methods.

Results: The analysis of the evolution of radiotherapy methods shows how significantly the precision of dose planning and delivery has changed. The atypical positioning imposed by the geometrical requirements of isotope applications or conventional techniques has been replaced by positioning on a therapeutic couch, which allows the more precise patient setup necessary for exact delivery of the planned dose. The dose can be fully optimized and calculated on tomographic images by algorithms implemented in planning systems, and the optimization process reduces doses to organs at risk. The agreement between planned and delivered doses can be checked by pretreatment verification methods, and patient positioning can be checked by image-guidance procedures.

Interpretation and conclusions: Current radiotherapy solutions allow a precise delivery of doses to the planning target volume while reducing doses to organs at risk. Nevertheless, it should be kept in mind that establishing radiotherapy as an important element of the whole therapeutic regimen resulted from the follow-up of patients treated by conventional techniques. To confirm the clinical value of new advanced techniques, clinical trials are required.

Abstract

This study presents a self-designed prompt gamma neutron activation analysis (PGNAA) model and uses FLUKA simulations to model heavy metals (Mn, Cu, Hg, Ni, Cr, Pb) in soil samples. The relationships between the prompt γ-ray yield of each heavy metal and the soil thickness, the heavy-metal content of the soil, and the source distance were obtained. The simulation results show that the prompt γ-ray yield of each heavy metal increases with soil thickness and reaches saturation at 18 cm. The greater the proportion of heavy metals in the soil, the greater the prompt γ-ray yield; the highest content studied is approximately 3%. Changing the distance between the neutron source and the soil sample does not affect the prompt γ-ray yield of the heavy metals.
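
As a small illustration of the thickness-saturation behaviour (the saturating-exponential form and the data points are assumptions, not the simulation output), the yield-versus-thickness trend can be summarized by a curve fit:

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating(t, y_inf, tau):
    """Yield that grows with soil thickness t and saturates at y_inf."""
    return y_inf * (1.0 - np.exp(-t / tau))

thickness = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22])  # cm
yield_rel = (saturating(thickness, 1.0, 5.0)
             + 0.01 * np.random.default_rng(2).standard_normal(thickness.size))

pars, _ = curve_fit(saturating, thickness, yield_rel, p0=[1.0, 5.0])
print(f"fitted thickness scale: {pars[1]:.1f} cm")
```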

Abstract

The triggering of a “dirty bomb” generates a complex scenario, with enormous challenges for responders due to initial misinformation and the urgency to act quickly yet effectively. Normally, the first 100 h are decisive for perceiving the risk in a more realistic dimension, and methodologies that rely on computational simulation can be valuable support when making key decisions. This work seeks to support the early decision-making process by using a Gaussian model for the distribution of a quantity of Cs-137 spread by a radiological dispersive device (RDD). By sequentially joining two independent programs, the HotSpot Health Physics codes and the RESidual RADiation (RESRAD)-RDD family of codes, we obtained results that suggest a segmented approach to the potentially affected population. These results indicate that (a) the atmospheric stability conditions represented by the Pasquill–Gifford classes and (b) the population subgroups defined by radiation exposure conditions strongly influence the post-detonation radiological effects. These variables should be taken into account when elaborating flexible strategies that cover many climatic conditions and prioritize attention to the different groups of the public at risk. During the initial phases of such an event, simulations using Gaussian models may be of value in anticipating possible changes in key variables during the decision-making process, variables that may severely affect the effectiveness of responders’ actions and the general public’s safety.
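
For orientation, the textbook ground-level Gaussian plume formula that underlies such dispersion tools can be sketched as follows (a simplified form with illustrative dispersion parameters; HotSpot's actual implementation includes many additional effects):

```python
import numpy as np

def gaussian_plume(Q, u, y, sigma_y, sigma_z):
    """Ground-level air concentration from a continuous ground-level release.

    Q: release rate (Bq/s), u: wind speed (m/s), y: crosswind offset (m),
    sigma_y/sigma_z: dispersion parameters (m) evaluated at the downwind
    distance for the relevant Pasquill-Gifford stability class.
    Simplified textbook form with total ground reflection.
    """
    return (Q / (np.pi * sigma_y * sigma_z * u)
            * np.exp(-y**2 / (2.0 * sigma_y**2)))

# Example: 1 GBq/s release, 3 m/s wind, on the plume centerline, with
# illustrative sigma values for a neutral stability class at ~500 m.
c = gaussian_plume(Q=1e9, u=3.0, y=0.0, sigma_y=36.0, sigma_z=18.0)
print(f"air concentration: {c:.3e} Bq/m^3")
```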

Abstract

The radon issue has been known worldwide for decades, and many scientific (ICRP Publication No. 137), technical (ICRU Report No. 88), and legislative (Council Directive 2013/59/EURATOM (EU-BSS)) documents have been published in the last decade. More and more attention is being paid worldwide to precise quantification of the concentration, and consequent health effects, of various pollutants. The demands on measurement quality and the variety of measurement techniques increase the need to unify measurement procedures and ensure metrological continuity. Countries around the world are beginning to unify their metrological procedures for determining different quantities on the basis of international recommendations and standards. For these and other reasons, the need for more accurate measurement of radon activity concentration and for unified radon metrology has become more pressing. This paper summarizes the main remarks on, and technical aspects of, the historical development of radon metrology.

Abstract

We theoretically investigate entanglement in a hybrid Fabry-Perot cavity system. A membrane in the cavity acts as a mechanical resonator, and a two-level quantum dot is coupled to both the cavity mode and the mechanical resonator. Entanglement is observed between the cavity field and the mechanical resonator, between the mechanical resonator and the quantum dot, and between the cavity field and the quantum dot. The logarithmic negativities of the first two subsystems are much larger than those in a system without the doubly coupled quantum dot, and the entanglement is robust against thermal noise (it persists at temperatures of tens of kelvin). We also find that, even without direct coupling between the cavity field and the mechanical resonator, one can still observe effective entanglement between them in our system. Our work may have potential applications in the study of multipartite entanglement in physical systems.
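
For reference, the logarithmic negativity used here can be computed from the 4×4 covariance matrix of any two modes via the standard two-mode Gaussian-state formula; the two-mode squeezed state below is a placeholder, not the paper's steady-state solution.

```python
import numpy as np

def log_negativity(V):
    """Logarithmic negativity E_N of a two-mode Gaussian state.

    V is the 4x4 covariance matrix in block form [[A, C], [C.T, B]]
    (vacuum variance 1/2). E_N = max(0, -ln(2 nu_minus)), where nu_minus
    is the smaller symplectic eigenvalue of the partial transpose.
    """
    A, B, C = V[:2, :2], V[2:, 2:], V[:2, 2:]
    sigma = np.linalg.det(A) + np.linalg.det(B) - 2.0 * np.linalg.det(C)
    nu_minus = np.sqrt((sigma - np.sqrt(sigma**2 - 4.0 * np.linalg.det(V))) / 2.0)
    return max(0.0, -np.log(2.0 * nu_minus))

# Two-mode squeezed vacuum as a placeholder covariance matrix (E_N = 2r)
r = 1.0
ch, sh = np.cosh(2 * r), np.sinh(2 * r)
V = 0.5 * np.block([[ch * np.eye(2), sh * np.diag([1, -1])],
                    [sh * np.diag([1, -1]), ch * np.eye(2)]])
print(log_negativity(V))
```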

Abstract

Survival analysis is a widely used method for establishing a connection between a time-to-event outcome and a set of potential covariates. Accurately predicting the time of an event of interest is of primary importance in survival analysis. Many different algorithms have been proposed for survival prediction; however, for a given prediction problem it is rarely, if ever, possible to know in advance which algorithm will perform best. In this paper we propose two algorithms for constructing super learners in survival data prediction, where the individual algorithms are based on proportional hazards. A super learner is a flexible approach to statistical learning that finds the best weighted ensemble of the individual algorithms. Finding the optimal combination of the individual algorithms by minimizing the cross-validated risk controls over-fitting of the final ensemble learner. Candidate algorithms may range from a basic Cox model to tree-based machine learning algorithms, provided all candidates are based on the proportional hazards framework. The ensemble weights are estimated by minimizing the cross-validated negative log partial likelihood. We compare the performance of the proposed super learners with existing models through extensive simulation studies. In all simulation scenarios, the proposed super learners are either the best fit or near the best fit. The performance of the newly proposed algorithms is also demonstrated with clinical data examples.
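
A minimal sketch of the weight-estimation step (learner outputs and names are placeholders): given each candidate's cross-validated linear predictors, the ensemble weights are chosen on the probability simplex to minimize the cross-validated negative log partial likelihood.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_partial_lik(eta, time, event):
    """Cox negative log partial likelihood for a linear predictor eta."""
    order = np.argsort(-time)                 # sort by decreasing time
    eta, event = eta[order], event[order]
    log_risk = np.logaddexp.accumulate(eta)   # log-sum-exp over the risk set
    return -np.sum((eta - log_risk)[event.astype(bool)])

def super_learner_weights(cv_etas, time, event):
    """Simplex weights minimizing the cross-validated partial likelihood.

    cv_etas: (n_samples, n_learners) cross-validated linear predictors
    from the proportional-hazards candidate algorithms.
    """
    k = cv_etas.shape[1]
    def objective(z):                         # softmax keeps weights on the simplex
        w = np.exp(z) / np.exp(z).sum()
        return neg_log_partial_lik(cv_etas @ w, time, event)
    res = minimize(objective, np.zeros(k), method="Nelder-Mead")
    return np.exp(res.x) / np.exp(res.x).sum()
```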

Abstract

We introduce a Bayesian framework for simultaneous feature selection and outlier detection in sparse high-dimensional regression models, with a focus on quantitative trait locus (QTL) mapping in experimental crosses. More specifically, we incorporate the robust mean-shift outlier-handling mechanism into the multiple-QTL mapping regression model and apply LASSO regularization concurrently to the genetic effects and the mean-shift terms through the flexible extended Bayesian LASSO (EBL) prior structure, thereby combining QTL mapping and outlier detection into a single sparse-model-representation problem. The EBL priors on the mean-shift terms prevent outlying phenotypic values from distorting the genotype-phenotype association and allow their detection as cases with outstanding mean-shift values following the LASSO shrinkage. Simulation results demonstrate the effectiveness of our new methodology at mapping QTLs in the presence of outlying phenotypic values while simultaneously identifying the potential outliers, and show that it maintains comparable performance to the standard EBL on outlier-free data.
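
The core mean-shift construction can be sketched with an ordinary LASSO standing in for the extended Bayesian LASSO prior (a deliberate simplification): augment the design matrix with an identity block so that every observation gets its own potential shift term, and flag cases with nonzero estimated shifts as outliers.

```python
import numpy as np
from sklearn.linear_model import Lasso

def mean_shift_lasso(X, y, alpha=0.1):
    """Joint sparse fit of genetic effects beta and per-case shifts gamma.

    Model: y = X beta + gamma + noise, with sparsity on both beta and
    gamma; cases with nonzero gamma_hat are flagged as outliers. A plain
    LASSO stands in for the extended Bayesian LASSO of the paper.
    """
    n, p = X.shape
    Z = np.hstack([X, np.eye(n)])        # identity block: one shift per case
    fit = Lasso(alpha=alpha, fit_intercept=True).fit(Z, y)
    beta, gamma = fit.coef_[:p], fit.coef_[p:]
    outliers = np.flatnonzero(gamma != 0.0)
    return beta, gamma, outliers
```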

Abstract

Understanding the dynamic formation of online collective attention has attracted diverse interest, spanning Internet memes, viral videos, social media platforms, and Web-based businesses, and has practical applications in marketing, advertising, and the propagation of information. A Bulletin Board System (BBS) can be regarded as an ecosystem of digital resources connected and shaped by the successive collective behaviors of users. Clicks on and replies to posts quantify the degree of collective attention: the collective clicking behavior of users drives the rise and fall of focus on posts and transports attention between topics, while the ratio of clicks to replies measures the “heat” of a post. We analyzed the collective attention dynamics of millions of users on the interactive Tianya Zatan BBS. By analyzing click dynamics, we uncovered a non-trivial self-exciting regularity, well described by a Hawkes process with an exponentially decaying novelty mechanism. This model explains the empirical BBS data remarkably well, including the temporal clustering of popular topics and the asymptotic normality of click counts. Our findings indicate that collective attention in large populations decays exponentially, suggesting a natural time scale over which novelty fades. Importantly, we show that self-exciting point processes can be used to model collective attention.
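
Concretely, such a model posits a click intensity λ(t) = μ + Σ_{t_i < t} α e^{−β(t − t_i)}, a Hawkes process with an exponentially decaying (novelty) kernel. A minimal simulation by Ogata's thinning method, with illustrative parameter values:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, rng=None):
    """Ogata thinning for a Hawkes process with exponential kernel:
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    rng = rng or np.random.default_rng()
    events, t = [], 0.0
    while t < t_max:
        # Current intensity bounds the (decaying) intensity until the next event.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)          # candidate event time
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if t < t_max and rng.random() < lam_t / lam_bar:
            events.append(t)                         # accept with prob lam/lam_bar
    return np.array(events)

# Stable regime requires alpha/beta < 1; values here are illustrative.
clicks = simulate_hawkes(mu=0.2, alpha=0.6, beta=1.0, t_max=100.0)
```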

Abstract

Electron propagation in a trapped state between an insulator and a metal during very close contact in a triboelectric nanogenerator (TENG) system was considered in this study. A single energy level (E0) was assumed for the trap, with a wave function inside the trap related to the ground-state energy. The phase of the wave function in the metal (neglecting the rebound effect at the wall) was assumed to be very small (δ′ ≪ 1) because of the large size of the metal. The contact distance between the trap and the metal is very small, which allows us to ignore the vacuum potential. Based on our results, the probability of finding the electron inside the trap oscillates in time (i.e., back-and-forth propagation of the electron between the trap and the metal leads to an equilibrium state). These results can be used to understand the quantum mechanisms of continuous contact, particularly in sliding-mode TENG systems.
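
The oscillation is the standard two-level (Rabi-type) behaviour; a hedged sketch under the stated assumptions, with the trap level coupled to a single representative metal level by a hypothetical matrix element g:

```python
import numpy as np

# Two-level sketch: trap state |1> at energy e0 coupled to a representative
# metal state |2> at energy em by coupling g (illustrative values, hbar = 1).
e0, em, g = 0.0, 0.1, 0.05
H = np.array([[e0, g],
              [g,  em]])

evals, evecs = np.linalg.eigh(H)
psi0 = np.array([1.0, 0.0])                  # electron starts in the trap
t = np.linspace(0.0, 200.0, 400)

# |<trap|psi(t)>|^2: probability of finding the electron in the trap
c = evecs.T @ psi0                           # expansion coefficients
amp = (evecs[0, :] * c) @ np.exp(-1j * np.outer(evals, t))
p_trap = np.abs(amp) ** 2                    # oscillates between trap and metal
```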

Abstract

Clustering, a fundamental unsupervised learning task, is an important method of data analysis, and K-means is the most popular clustering algorithm. In this paper, we consider clustering in a reduced feature space to address the low efficiency of K-means in Big Data clustering. Unlike traditional methods, the proposed algorithm guarantees that the clustering accuracy is consistent before and after dimensionality reduction, accelerates K-means when the cluster centers and distance functions satisfy certain conditions, matches the preprocessing step and the clustering step completely, and improves both efficiency and accuracy. Experimental results demonstrate the effectiveness of the proposed algorithm.
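
A minimal sketch of the general reduce-then-cluster pattern (using PCA as a stand-in for the paper's specific dimension-reduction step, which is designed to preserve clustering accuracy):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = rng.standard_normal((100_000, 200))      # stand-in high-dimensional data

# Reduce the feature space first, then run K-means in the reduced space;
# distances are far cheaper to evaluate in 20 dimensions than in 200.
X_red = PCA(n_components=20).fit_transform(X)
labels = KMeans(n_clusters=10, n_init=10).fit_predict(X_red)
```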

Abstract

A model for predator-prey interactions with herd behaviour is proposed. Its novelty includes a smooth transition from individual behaviour (low numbers of prey) to herd behaviour (large numbers of prey). The model is analysed using standard stability and bifurcation techniques. We prove that the system undergoes a Hopf bifurcation as we vary the parameter representing the efficiency of predators (dependent, for instance, on the predation rate), giving rise to sustained oscillations. The proposed model appears to possess more realistic features than previous approaches while also being relatively easier to analyse and understand.
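
One hedged way to realize such a smooth transition (an illustrative functional response, not necessarily the paper's exact form) is a predation term that scales linearly with prey density when prey are few and with its square root when prey are many:

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(t, z, r, K, a, e, m):
    """Predator-prey dynamics with a herd-like functional response.

    g(x) = x / sqrt(1 + x) interpolates smoothly between individual
    behaviour (g ~ x for small x) and herd behaviour (g ~ sqrt(x) for
    large x). An illustrative choice, not the paper's exact form.
    """
    x, y = z
    g = x / np.sqrt(1.0 + x)
    dx = r * x * (1.0 - x / K) - a * g * y   # logistic prey minus predation
    dy = e * a * g * y - m * y               # predator growth minus mortality
    return [dx, dy]

# Varying the predator-efficiency parameter e moves the system through a
# Hopf bifurcation, from a stable equilibrium to sustained oscillations.
sol = solve_ivp(model, (0.0, 500.0), [5.0, 1.0],
                args=(1.0, 10.0, 0.5, 0.6, 0.3), dense_output=True)
```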