International Conference Cybernetic Modelling of Biological Systems MCSB 2015

Kraków (Poland), 14–15 May 2015


Irena Roterman-Konieczna (Kraków)

Ryszard Tadeusiewicz (Kraków)

Scientific Committee

P. Augustyniak (Kraków), E. Cydzik-Krasicka (Zielona Góra), M. Darowski (Warszawa), A. Gacek (Zabrze), E. Gorzelańczyk (Toruń), R. Gryglewski (Kraków), J. Kiwerski (Konstancin), M. Kłys (Kraków), J. Kulikowski (Warszawa), M. Kuchta (Warszawa), J. Leszek (Wrocław), P. Ładyżyński (Warszawa), R. Maniewski (Warszawa), A. Nowakowski (Gdańsk), M. Nowakowski (Kraków), K. Stąpor (Gliwice), A. Świerniak (Gliwice), Z. Tabor (Kraków), W. Torbicz (Warszawa), J. Waniewski (Warszawa), J. Wtorek (Gdańsk), E. Zalewska (Warszawa)

Organising Committee

K. Sałapa (UJ CM Kraków), P. Walecki (UJ CM Kraków), Z. Wiśniowski (UJ CM Kraków), M. Długosz (AGH Kraków)

What simulations at the molecular level can teach us about the folding of proteins

Chomilier J.1, Duprat E.1, Carpentier M.1, Papandreou N.2, Banach M.3, Roterman I.3

1Institute of Mineralogy, Materials Physics and Cosmochemistry, Université Pierre et Marie Curie, France

2Laboratory of Genetics, Agricultural University of Athens, Greece

3Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

The theme of ‘Cybernetic Modelling of Biological Systems’ covers a large range of scales, from the molecular level up to the population level. At the scale of molecules, the first method historically is so-called comparative modelling, which aims at producing a model of a protein from the structure of a homologous sequence. Nowadays, several web services can produce, in reasonable computer time, a model for most query sequences, provided one sample of their architecture has already been deposited in one of the structural databases. In the meantime, large improvements in the algorithms designed to decipher distant relationships among sequences sharing a common ancestor have increased the quality of the annotations in sequence databases, in particular through their ability to transfer information from a related protein. Then the prediction of complexes came into play, concerning protein–protein interactions, protein–ligand binding and, more recently, protein–peptide docking. Another aspect concerns the dynamics of proteins, driven by the increasing power of computers, the parallel use of huge numbers of CPUs and the improvement of the algorithms. Although it is not a routine task, the folding of some small proteins has been achieved on dedicated hardware for trajectories of several milliseconds. Simulations can give access to the intermediate states a protein goes through during its folding process. Besides, the analysis of dynamic processes has given insights into thermodynamics, and increasingly accurate free energy predictions are available, offering some view of the effects of a point mutation on the structure and, hence, on the function.

Protein folding simulations of ‘3D Gauss accordant’ structures

Tomanek M.1, Roterman I.2, Kalinowska B.3, Baster Z.3, Dulak D.3, Szepieniec T.1

1Academic Computer Center AGH Cyfronet, Poland

2Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

3Department of Medical Biotechnology, Jagiellonian University, Poland

Protein folding is a biological process that creates a fully functional protein starting from a linear chain of amino acids. The ability to predict the correct outcome of this process using in silico methods is crucial for drug design, especially in the context of personalised therapy. This abstract describes the concept, implementation and exemplary results of the protein folding simulator developed for the PLGrid NG project. The tool uses a two-step model composed of an Early Stage and a Late Stage [1]. The objective of the Early Stage is the preparation of an initial 3D conformation with secondary structure drafts for a given amino acid sequence. The output of this stage is optimised in the next step using an internal force field (electrostatic and torsional potentials and van der Waals interactions) and an external one (the influence of the surrounding water). Currently, the software is prepared for so-called ‘3D Gauss accordant’ proteins, i.e. proteins whose residues occupy spatial positions satisfying a 3D Gauss hydrophobicity model. Simulations for several such structures were executed, and some of the results revealed quite high similarity to the native structure (e.g. TM-score 5.3 and RMSD 3.0). The results are promising, but the process still needs enhancements and should be tested on other proteins. To find a method of filtering good results without knowledge of the native structure, relationships between the correctness of the conformation prediction (RMSD, QCS, GDT and TM-score coefficients) and process/result parameters are being explored.
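
Among the quality measures listed above, RMSD is the simplest to state explicitly. The following minimal sketch (not part of the authors' simulator) computes it for two equal-length chains of C-alpha coordinates; it assumes the two structures are already optimally superposed, so no rotation/translation fitting is performed:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two conformations given as lists
    of (x, y, z) C-alpha coordinates. Assumes pre-superposed structures."""
    assert len(coords_a) == len(coords_b), "equal chain lengths required"
    total = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(total / len(coords_a))
```

Identical structures give RMSD 0; a rigid translation of the whole chain by 1 Å gives RMSD 1, which is why superposition must precede the computation.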

[1] I. Roterman, L. Konieczny et al.: Two-intermediate model to characterize the structure of fast-folding proteins. J Theor Biol 283(1), 2011, pp. 60-70

Hydrophobic core structure of macromomycin, the apoprotein of the antitumour antibiotic auromomycin: the fuzzy oil drop model applied

Roterman I.1, Banach M.1, Konieczny L.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Institute of Medical Biochemistry, Jagiellonian University Medical College, Poland

The fuzzy oil drop model [1,2], applied to analyse the structure of macromomycin, the apoprotein of the antitumour antibiotic auromomycin [3], reveals the differentiation of the β-structural fragments present in the β-sandwich. The seven-stranded antiparallel β-barrel and two antiparallel β-sheet ribbons represent a highly ordered geometry. However, their participation in hydrophobic core formation appears to differ. The structure of the complete domain represents the status of an irregular hydrophobic core. Some β-structural fragments represent a hydrophobicity density distribution accordant with the idealised distribution expected from the fuzzy oil drop model. Four β-structural fragments generating one common layer appear to be unstable with respect to the general structure of the hydrophobic core. This area is expected to be more flexible than the rest of the molecule. The influence of ligand binding (two 2-methyl-2,4-pentanediol molecules) and of two SS-bonds on the structure of the hydrophobic core will be discussed.
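
The accordance assessment used in the fuzzy oil drop model [1,2] can be sketched as follows: the observed hydrophobicity distribution O is compared, via Kullback-Leibler divergence, against both the idealised Gaussian-derived distribution T and a uniform reference R, and a fragment is treated as accordant when O is closer to T than to R (the RD < 0.5 criterion). The implementation below is a minimal illustration with made-up three-residue distributions, not the authors' software:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def is_accordant(observed, theoretical):
    """Accordant when the observed hydrophobicity distribution is closer to
    the theoretical (Gaussian) distribution than to the uniform (random)
    one, i.e. RD = D(O||T) / (D(O||T) + D(O||R)) < 0.5."""
    n = len(observed)
    uniform = [1.0 / n] * n
    d_t = kl_divergence(observed, theoretical)
    d_r = kl_divergence(observed, uniform)
    return d_t / (d_t + d_r) < 0.5
```

A fragment whose observed profile matches the Gaussian profile yields RD = 0 (accordant); one that matches the uniform reference yields RD = 1 (discordant).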

[1] Banach M, Konieczny L, Roterman I. (2014) The fuzzy oil drop model, based on hydrophobicity density distribution, generalizes the influence of water environment on protein structure and function. J. Theor. Biol. 359, 6-17.

[2] Kalinowska B, Banach M, Konieczny L, Roterman I. (2015) Application of Divergence Entropy to Characterize the Structure of the Hydrophobic Core in DNA Interacting Proteins. Entropy 17(3), 1477-1507.

[3] Van Roey P, Beerman TA. (1989) Crystal structure analysis of auromomycin apoprotein (macromomycin) shows importance of protein side chains to chromophore binding selectivity. Proc Natl Acad Sci U S A 86:6587-6591.

Construction of fuzzy automata for determining the structure of the early-stage intermediate in protein folding

Wiśniowski Z.1, Roterman I.1, Kalinowska B.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University, Poland

In this paper we present a definition of a fuzzy automaton for determining the structure of the early-stage intermediate in protein folding. The construction is formulated using the concept of fuzzy sets and fuzzy linguistic rules. The process is divided into two steps. The first step, which can be called ‘learning’, determines the degrees of membership (from the interval [0, 1]) of each tetrapeptide in one of the seven structural motifs described in the in silico model. The second step attempts the automatic classification of an individual amino acid, at its position in the chain, to one of the seven structural motifs. The fuzzy automaton works by analysing a sequential list of variables. It starts in the initial state and, according to the result of a decision function d, changes its internal state depending on the previous state and the variable value. At each step the function d returns a fuzzy set of internal automaton states. The automaton then chooses one of them as its next state. The rule is to choose the state with the highest degree of membership first; in later recursive steps, the remaining states are tried in descending order of membership. When the automaton has consumed the last variable and made its final state change, it checks whether this state belongs to F (the set of ending states). If not, the automaton returns to the previous state and the previous variable and recursively repeats its action, choosing a different path in consecutive steps. In some cases all paths may be checked without the automaton reaching an ending state. In that case there is no answer, so the individual amino acid at this position of the chain cannot be assigned to any of the seven classes.
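
The traversal described above is a depth-first search ordered by membership degree. The sketch below illustrates it on abstract states; `d(state, v)` is a placeholder decision function (the state names and membership values in the usage example are made up), not the authors' implementation:

```python
def run_fuzzy_automaton(state, variables, d, finals, i=0):
    """Depth-first run of a fuzzy automaton. d(state, v) returns a dict
    {next_state: membership degree in [0, 1]}; candidate states are tried
    in descending order of membership, and on a dead end we backtrack.
    Returns the state path on success, or None if no ending state in
    `finals` can be reached (no classification possible)."""
    if i == len(variables):
        return [state] if state in finals else None
    candidates = sorted(d(state, variables[i]).items(), key=lambda kv: -kv[1])
    for nxt, _degree in candidates:
        path = run_fuzzy_automaton(nxt, variables, d, finals, i + 1)
        if path is not None:
            return [state] + path
    return None  # all paths checked, no ending state reached
```

For example, with d('q0', 'a') = {'q1': 0.9, 'q2': 0.1} and d('q1', 'b') = {'qf': 1.0}, the run on ['a', 'b'] with finals {'qf'} follows the highest-membership branch and returns ['q0', 'q1', 'qf'].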

Analysis of lipid-binding proteins in human brain

Rosicka R.1, Roterman I.2, Banach M.2

1Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University, Poland

2Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

Fatty acid binding proteins, located in various tissues, have a huge impact on the functioning of cells despite their small size. In this review, we focus on the little-known human brain fatty acid binding protein (B-FABP), which is mainly present in glial cells. Increased expression of this protein can be seen during the development of the central nervous system, but also in neurodegenerative disorders such as Alzheimer’s disease, Parkinson’s disease or Huntington’s disease. B-FABP exhibits high affinity for poly-unsaturated fatty acids such as omega-3 and omega-6 fatty acids, which are more abundant in the brain than in other tissues. A more detailed analysis of the crystallised structure of human B-FABP should lead to a better comprehension of its functions and of its binding mechanism with poly-unsaturated fatty acids. Analysis based on the fuzzy oil drop model suggests the mechanism of fatty acid binding and the specificity of the structural features of these proteins. The objects of the analysis are the crystal structure of the fatty acid binding protein of PDB ID 1FE3, classified (CATH) as crystallised in complex with a ligand (oleic acid), and the dynamic forms of the fatty acid binding protein in the apo form (PDB ID 1JJX). The common structural characteristics of both proteins and their ability to bind a highly hydrophobic ligand are shown using the structure of the hydrophobic core as the criterion for the structure–function relation in this protein.

Negative feedback-based simulation of bio-systems

Wach J.1, Bubak M.1,2, Konieczny L.3, Roterman I.4, Chłopaś K.4

1Department of Computer Science, AGH University of Science and Technology, Poland

2Academic Computer Centre Cyfronet AGH, Poland

3Institute of Medical Biochemistry, Jagiellonian University Medical College, Poland

4Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

Computer simulations are nowadays used as a way to understand bio-systems such as a living cell. As these systems are by nature very complex, sophisticated methods have been employed, from the simplest ones [1] to complex simulators composed of separate modules responsible for different aspects of bio-systems [2]. Algorithms focusing on selected physical laws tend to lose detail, preventing researchers from discovering emergent properties of bio-systems. On the other hand, simulating all of the details, as done by [2], makes the modelling of larger systems seem nearly impossible (a full model of a simple bacterial cell requires more than 1900 parameters and generates more than 500 MB of data in a single run), while a human cell is about 40 times more complicated (525 vs. 20k protein-coding genes). Therefore, a simplified approach is required. Each and every bio-system lives in an open environment. To be stable, acting within defined boundaries (homeostasis), it needs a regulation mechanism. The simplest known automated regulator is a negative feedback inhibition system (NFIS). The authors of [3] propose to utilise the NFIS as a bio-system building block linking structure with function in the form of a functional proteome. A single NFIS is composed of an effector responsible for delivering a product and a receptor delivering a signal regulating the production. Once a defined concentration of the product is reached (the receptor’s threshold), the receptor signals the effector to stop production. Diffusion is employed to lower the product’s concentration over time. Once it falls below the threshold, the receptor turns the signal off, allowing the effector to continue production. To model a full bio-system, NFISs are connected via cooperation and coordination. The first relation is created for components exchanging a substance (product–substrate). An enzymatic cascade in which one enzyme delivers the substrate to another, like glycolysis, is an example of cooperating components.
The current implementation is capable of computing concentrations for fairly complex systems, with both cooperation and coordination connections.
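
The single-loop behaviour described above (effector, receptor threshold, diffusion) can be sketched in a few lines. The parameters below are hypothetical, and this is only an illustration of the regulation principle, not the PLGrid implementation:

```python
def simulate_nfis(steps, production=1.0, threshold=5.0, diffusion=0.1, c0=0.0):
    """Minimal single-NFIS sketch: the effector adds `production` per step
    while the concentration is below the receptor's threshold; diffusion
    removes a fixed fraction of the product each step. Returns the
    concentration trace."""
    c = c0
    trace = []
    for _ in range(steps):
        if c < threshold:      # receptor signal off: effector produces
            c += production
        c -= diffusion * c     # diffusion lowers the concentration
        trace.append(c)
    return trace
```

Run long enough, the trace rises and then oscillates around the threshold, which is the homeostatic behaviour the NFIS building block is meant to capture.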

Acknowledgement: This research was funded by the PLGrid CORE project.

[1] Boryczko K, Dzwinel W, Yuen DA (2003) Dynamical clustering of red blood cells in capillary vessels. J Mol Model 9, 16-33.

[2] Karr JR et al. (2012) A whole-cell computational model predicts phenotype from genotype. Cell 150(2).

[3] Konieczny L, Roterman I, Spólnik P (2013) Systems Biology: Functional Strategies of Living Organisms. Springer. ISBN 978-3-319-01336-7.

The use of metal-carrying dye markers to reveal the structure of amyloid proteins by EM

Chłopaś K.1, Jagusiak A.1, Konieczny L.1, Piekarska B.1, Roterman I.2, Rybarska J.1, Stopa B.1, Zemanek G.1, Bielańska E.3, Piwowar P.4, Woźnicka O.5

1Institute of Medical Biochemistry, Jagiellonian University Medical College, Poland

2Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

3Institute of Catalysis and Surface Chemistry, Polish Academy of Science, Poland

4Department of Measurements and Electronics, AGH University of Science and Technology, Poland

5Department of Cell Biology Imaging, Faculty of Biology, Earth Sciences, Jagiellonian University, Poland

The self-assembling dye Congo red is generally considered a specific ligand and marker of amyloids, revealing by complexation the common intrinsic nature of these abnormal proteins despite their different origins and structural features. Amyloidosis is a life-threatening systemic disease in which deposits of these abnormal proteins can be found in many tissues. Amyloids have long been the subject of scientific research; unfortunately, their nature is still largely hypothetical. The reason is that neither amyloids themselves nor their complexes with Congo red undergo crystallisation, due to their lack of structural uniformity. Electron microscopy can potentially address this issue, on condition that the detection of Congo red is enhanced. This can be achieved by embedding metal ions in Congo red micelles. However, Congo red itself exhibits rather poor affinity for complexing metal ions, including the silver ion, which seems particularly attractive for the intended purpose. An alternative solution is to attach the ions to other organic compounds of high affinity, which can then form stable complexes with Congo red by intercalation, producing a uniform co-micellar structure. The above criteria are met by the dye Titan yellow, which binds silver ions strongly. Our work presents the basic properties of this substance and characterises its complexation properties. In this way Congo red, playing the role of an indirect silver carrier, may be revealed in its specific complexes by EM.

Neural Networks as a tool for modelling of biological systems

Tadeusiewicz R.1

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

Neural networks have become very popular as a tool for modelling numerous systems: technological, economic, sociological, psychological and even political ones. On the other hand, neural networks are models of the neural structures and neural processes observed in the real brain. However, for modelling the real neural structures and processes occurring in a living brain, neural networks are too simplified and too primitive. Nevertheless, neural networks can be used for modelling the behaviour of many biological systems and structures. Such models are not useful for explaining the biological systems and processes under consideration, but they can be very useful for analysing the behaviour of such systems, including the prognosis of the future results of selected activities, e.g. the prognosis of the results of different therapies for the modelled illnesses. In the paper, selected examples of such models and their applications will be presented.

Speaker discrimination based on neural networks and dynamic formant frequencies of Polish vowels

Sałapa K.1, Trawińska A.2, Tadeusiewicz R.3

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Institute of Forensic Research, Poland

3Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

This paper has two goals. Firstly, it verifies the effectiveness of neural networks as a classification tool in speaker identification. Secondly, it tries to determine whether any specific context (i.e. the triad consonant-vowel-consonant/pause) carries more discriminatory information than others. The study is based on recordings of ten sentences, each voiced three times by 25 males representing the Lesser Polish dialect, aged 21-23. Recordings were obtained in WAV PCM format (sampling rate of 44.1 kHz, 16-bit depth). Formant frequencies were extracted automatically using the STx software tool published by the Austrian Academy of Sciences, following manual signal segmentation. The validity of the automatic formant extraction was assessed by the analyst performing the measurement, using spectrographic data along with Fourier and LPC analysis. The advantage of formant frequencies is attributed to the anatomical structure of the speech organ, which imposes certain restrictions on its use, as well as to articulation habits, which are largely conditioned by the speaker’s environment even when subjected to conscious control. Pilot studies showed that the Multilayer Perceptron (MLP) provides more accurate discrimination than Radial Basis Function (RBF) techniques; for this reason, the authors considered only MLP models. Automatic Network Search was applied with two possible combinations of formant frequencies (F1-F2-F3 and F2-F3) as predictors of membership of 25 groups. These models were created for each context separately. Next, an additional variable representing the duration [ms] of the vowel was added to each model. The efficacy of the proposed models varies between 40% and 90% of correct speaker classifications; the higher accuracy rates are associated with the models involving the duration of the vowel. All computations were carried out using the STATISTICA Neural Networks v.10 software (StatSoft Inc., Tulsa, OK, USA).
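
The kind of classifier used here can be illustrated with a self-contained toy MLP: one hidden layer of logistic units and a softmax output over speaker labels, trained by plain gradient descent. Everything below is a sketch under stated assumptions (pure Python instead of STATISTICA, made-up normalised formant-like features, two speakers instead of 25), not the authors' configuration:

```python
import math, random

def train_mlp(X, y, n_hidden=4, n_classes=2, lr=0.5, epochs=500, seed=1):
    """Train a one-hidden-layer MLP on feature vectors X (e.g. normalised
    [F1, F2, F3, duration]) with integer speaker labels y. Returns a
    predict(x) function giving the most probable class."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    b1 = [0.0] * n_hidden
    W2 = [[rng.uniform(-0.5, 0.5) for _ in range(n_hidden)] for _ in range(n_classes)]
    b2 = [0.0] * n_classes

    def forward(x):
        h = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
             for row, b in zip(W1, b1)]                      # sigmoid hidden layer
        o = [sum(w * hi for w, hi in zip(row, h)) + b for row, b in zip(W2, b2)]
        m = max(o)
        e = [math.exp(v - m) for v in o]                     # stable softmax
        s = sum(e)
        return h, [v / s for v in e]

    for _ in range(epochs):
        for x, t in zip(X, y):
            h, p = forward(x)
            do = [pi - (1.0 if k == t else 0.0) for k, pi in enumerate(p)]
            dh = [hi * (1 - hi) * sum(do[k] * W2[k][j] for k in range(n_classes))
                  for j, hi in enumerate(h)]                 # backprop through hidden
            for k in range(n_classes):
                for j in range(n_hidden):
                    W2[k][j] -= lr * do[k] * h[j]
                b2[k] -= lr * do[k]
            for j in range(n_hidden):
                for i in range(n_in):
                    W1[j][i] -= lr * dh[j] * x[i]
                b1[j] -= lr * dh[j]

    def predict(x):
        _, p = forward(x)
        return max(range(n_classes), key=lambda k: p[k])
    return predict
```

With two well-separated toy "speakers" the trained network classifies the training vectors correctly; raw formant values in Hz would need scaling before being fed to the sigmoid units.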

Application of Multilayer Perceptron to identification of Polish-language speakers based on the vowel /e/

Sałapa K.1, Trawińska A.2, Tadeusiewicz R.3

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Institute of Forensic Research, Poland

3Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

The objective of this paper is to gain further insight into the prospects of identifying Polish-language speakers based on the vowel /e/ (stressed or unstressed). This study verifies the hypothesis that stressed (unstressed) /e/ formant frequencies aggregated over various contexts enable less accurate discrimination than stressed (unstressed) /e/ formant frequencies representing a single context. The study is based on recordings of ten sentences, each voiced three times by 20 males representing the Lesser Polish dialect, aged 21-23. Recordings were obtained in WAV PCM format (sampling rate of 44.1 kHz, 16-bit depth). Formant frequencies were extracted automatically using the STx software tool published by the Austrian Academy of Sciences, following manual signal segmentation. The validity of the automatic formant extraction was assessed by the analyst performing the measurement, using spectrographic data along with Fourier and LPC analysis. Multilayer Perceptron neural network models were used to verify the hypothesis. Two possible combinations of formant frequencies (F1-F2-F3 and F2-F3) representing stressed or unstressed /e/ vowels, together with the duration of the vowel [ms], were considered as predictors for models created for each context separately. Then, models for the aggregated data were built with additional context-related information. The results confirm the verified hypothesis: the accuracy of classification is greater when based on a single context than for similar input data aggregated over several different contexts. All computations were carried out using the STATISTICA Neural Networks v.10 software (StatSoft Inc., Tulsa, OK, USA).

Interfaces for tetraplegic people – review of solutions supporting activities of daily living

Augustyniak P.1, Mikrut Z.1

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

Currently, a disabled person can participate in social life and lead an independent individual life with the assistance of intelligent support from technology. The paper reviews various approaches and recent achievements in the design and prototyping of interfaces for tetraplegic people proposed by the Biocybernetic Laboratory AGH. After a short review of other available systems, three categories of ADL-supporting solutions are presented: input devices, control software and independent systems. Joysticks were selected as the preferable input devices among the other tested options (touchpad, tongue manipulator, eyetracker, blow sensor and direct sensor of brain electrical activity). Joysticks are free from hygienic issues, tolerant to accidental use or excessive force, and easily operable by chin or mouth. The flexibility required for the personalisation of input devices has been achieved with dedicated software defining the interpretation of the operator’s gestures. This process includes the identification of a gesture, its duration and its coincidence with others, to adapt the final signal to the specificity of the supported human. The paper also presents two independent systems dedicated to supporting selected ADL. One of them is an intelligent interface for the infrared-based remote control of home appliances, and the second is a joystick-based emulator of a computer mouse. Both devices have a learning mode that allows for adaptation to the particular environment (i.e. the devices to be controlled) and the specific abilities of the human operator.

Direct immobilization of human cytokine antibodies on a TiO2 nanotube array

Arkusz K.1, Krasicka-Cydzik E.1

1Department of Biomedical Engineering, University of Zielona Góra, Poland

So far, the direct immobilization of bioreceptors on the surface of negatively charged titania nanotubes (TNT) has only been possible for positively charged proteins [1-2]. As monoclonal antibodies (mAb) are negatively charged in human plasma, developing an immunosensor on TNT based entirely on physical adsorption has required either bioreceptors with an isoelectric point higher than the pH of the buffer or a modification of the platform material. The aim of our research was to prepare a thermally modified TNT array as a platform for the direct immobilization of IL-6, IL-8 and TNFα mAb. TNT layers were formed by anodizing Ti foil in an ethylene glycol solution with 0.6% wt. NH4F and annealing in argon at 550°C for 2 h. SEM, EDS and XPS analyses, along with electrochemical measurements, were used to characterize the material. The immobilization of mAb on the TNT surface was performed by the drop technique, and its efficiency was evaluated by spectrophotometric analysis. The thermal modification of TNT converts TiO2 into an anatase/rutile mixture, making the material semiconducting and yielding a positive open circuit potential. The results of the spectrophotometric analysis confirmed the interaction between the OH groups located on the surface of the TNT and the COOH groups of the negatively charged mAb. We considered the location and conformation of the mAb enabling the formation of the antigen-antibody complex. The thermal modification of TNT allowed us to immobilize the mAb by physical adsorption. In this new strategy it is not necessary to pre-treat the probe surface with any kind of blocking buffer.

[1] P. Schmuki et al., Mini-Rev Med Chem 13(2), 2013, 194.

[2] K. Arkusz, PhD thesis, 2014.

Modelling of cortico-subcortical loops

Gorzelańczyk E.1

1Department of Theoretical Basis of Bio-Medical Sciences and Medical Informatics, Collegium Medicum of Nicolaus Copernicus University in Torun, Poland

The concept of cortico-subcortical loops is one of the explanations of the physiological control of the majority of motor, emotional and cognitive functions. Its most important elements are the striatum and the cerebral cortex. Especially in the pyramidal cells of the cerebral cortex and the medium spiny neurons of the striatum there is a capacity for plastic changes relating to the control of broadly defined mental functions (motor, emotional, cognitive). The cerebral cortex is linked to the striatum via cortico-subcortical pathways, from where information is transmitted to the globus pallidus pars internalis or the substantia nigra pars reticulata (which physiologically and anatomically constitute one structure), or via the ventral globus pallidus reaches the thalamus and subsequently the cerebral cortex. An increase in the concentration of dopamine in the striatum facilitates the choice of the information channels that will be disinhibited, while a decrease in the concentration of dopamine hinders this choice. The presented model attempts to explain the errors occurring in the selection of behaviours adjusted to a particular situation, and the difficulties in completing already chosen behaviours, in Parkinson’s disease, in which the concentration of dopamine in the nigrostriatal pathway is decreased.

Saccadic eye movement abnormalities in Alzheimer’s disease

Walecki P.1, Pasgreta K.2, Kunc M.3, Gorzelańczyk E.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Department of Theoretical Basis of Bio-Medical Sciences and Medical Informatics, Collegium Medicum of Nicolaus Copernicus University in Toruń, Poland

3Airedale NHS Trust Steeton, University of Leeds, United Kingdom

There are abnormalities in the eye movements of individuals with Alzheimer’s disease, which are related to dysfunctions of the oculomotor frontal-subcortical circuit. The aim of the study is to compare the parameters of saccadic eye movements in individuals with Alzheimer’s disease with those of older adults without dementia. 31 individuals with mild and intermediate Alzheimer’s dementia (AD) (MMSE > 13) (26 women, mean age 76.8 ± 6.41, and 5 men, mean age 79.1 ± 5.21) and 30 individuals without symptoms of dementia (matched for age) were examined. Two experiments were performed: Latency Trials (LAT) and Reflexive with Gap (RXG). The saccadic eye movements were measured with the use of the Saccadometer Advanced. Saccadic latency [ms], promptness [Hz], duration [ms], amplitude [deg], peak velocity [deg/s] and the number of executed saccades were analysed. Statistically significant differences in the number of saccades, latency and promptness were found between patients with AD and the control group. The level of oculomotor efficiency in mild and intermediate Alzheimer’s disease is significantly lower in relation to that of older people without dementia.
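
A between-group comparison of this kind is typically done with a two-sample test; the abstract does not name the statistic used, so the following Welch's t sketch (which does not assume equal group variances) is only illustrative, with made-up latency values in the test rather than the study's data:

```python
import math, statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic and approximate degrees of freedom for two
    independent samples, e.g. saccadic latencies [ms] of an AD group
    versus a control group."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = statistics.fmean(sample_a), statistics.fmean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

The resulting t and df would then be looked up against the t distribution to obtain the p-value reported for each saccade parameter.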

Quantitative methods of assessment of psychomotor function in opioid-addicted patients

Lasoń W.1, Gorzelańczyk E.2, Walecki P. 1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Department of Theoretical Basis of Bio-Medical Sciences and Medical Informatics, Collegium Medicum of Nicolaus Copernicus University in Toruń, Poland

Impaired psychomotor function is observed in opioid-addicted patients. Patients have significant difficulty in controlling the performance of movements: in particular, difficulty initiating movement, impaired coordination, and reduced precision and speed. A signal representing the spectral analysis of upper limb tremor can be used as an objective measure supporting the diagnostic process. 139 opioid-addicted patients were examined twice, immediately before and about 1 hour after the oral administration of a therapeutic dose of methadone. Among them, 77 were HIV-positive and 62 HIV-negative. An original test implemented on a graphics tablet was used. The designed software allowed an analysis of the motion parameters: force levels, the time of the task, the speed and acceleration of the plot, and the amplitude and frequency of hand tremors. The analysis shows a reduction of the upper limb tremor amplitude after the administration of a single dose of methadone in the drawing task. The analysis of the tremor spectra indicates a slight deterioration in the stability of the executed motion in HIV-negative patients and a statistically significant improvement in the stability of motion in HIV-positive patients (reduction of the amplitude variations for all components, p < 0.05). A single dose of methadone in opioid-addicted individuals reduces dominant-hand tremors. This indicates an improvement in graphomotor and psychomotor functions. The administration of a single therapeutic substitution dose improves the stability of the performed motion, which has been observed particularly in HIV-positive patients.
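
Spectral analysis of a tremor recording of the kind described can be sketched with a plain discrete Fourier transform. This is a generic illustration, not the authors' tablet software; `fs` is the sampling rate of the tremor signal:

```python
import math

def amplitude_spectrum(signal, fs):
    """Single-sided DFT amplitude spectrum of a real-valued tremor
    recording. Returns (frequencies [Hz], amplitudes), so the dominant
    tremor component is the frequency with the largest amplitude."""
    n = len(signal)
    freqs, amps = [], []
    for k in range(n // 2 + 1):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        a = math.hypot(re, im) / n
        if 0 < k < n / 2:
            a *= 2  # fold the negative-frequency half into the positive side
        freqs.append(k * fs / n)
        amps.append(a)
    return freqs, amps
```

Comparing such spectra before and after methadone administration gives the per-component amplitude changes the abstract reports.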

Diagnostic tools for the evaluation of disorders associated with HIV infection

Walecki P.1, Gorzelańczyk E.2, Lasoń W.1, Feit J.2, Pasgreta K.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Department of Theoretical Basis of Bio-Medical Sciences and Medical Informatics, Collegium Medicum of Nicolaus Copernicus University in Torun, Poland

The aim of this study is to assess whether, and to what extent, cognitive and psychomotor processing in patients addicted to opioids differs from that of healthy subjects (cross-sectional study), and what the short-term and long-term impact is (longitudinal study) of the administration of substitute substances (drug-substitution treatment) in patients addicted to opiates. A further aim is to improve the original (authors’ own) psychomotor assessment battery, both technically, by the calculation of new parameters for the non-linear variables, and clinically, by finding psychomotor and cognitive markers in opiate addicts, in particular on the basis of the results obtained in the research and in the study of eye movements. We compare the cognitive and psychomotor functions in patients addicted to opioids, both HIV-positive and HIV-negative. Identifying a characteristic pattern of psychomotor disturbances (graphomotor disorders, abnormal eye movements) can provide a valuable diagnostic tool that supports the clinical and neuropsychological evaluation of the effectiveness of pharmacotherapy and psychotherapy.

Model of dorsolateral prefrontal cortex (DLPFC) activity in saccadic eye movements

Walecki P.1, Lasoń W.1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

The dorsal attention system is located bilaterally in the dorsolateral prefrontal cortex (DLPFC) and is associated with descending volitional processes of visual control. The studies analyzed characteristic patterns of neural activity related to the cognitive processes in saccadic refixations, as well as the activity of the DLPFC depending on the performance of externally guided saccades (prosaccades) and internally guided saccades (antisaccades). In tasks that require conscious visual control, latency is significantly longer in the antisaccade task than in the prosaccade task. Based on the results it can be concluded that the DLPFC plays a key role in volitional saccadic refixations, especially in planning and visual attention.

Eye movement measurement in the diagnosis of ADHD/HKD

Walecki P.1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

ADHD/HKD varies widely in its clinical symptomatology, probably due to a complex pathogenesis that involves both neurophysiological dysfunctions and genetic factors. To obtain objective diagnostic criteria allowing identification of markers of ADHD/HKD, it is necessary to understand the biological causes and pathophysiology of these disorders. The development of brain imaging techniques has focused research on the neuroanatomical and biochemical causes of ADHD/HKD. The results confirmed that ADHD/HKD is a disorder characterized by neurophysiological impairment of certain brain structures. The presented study demonstrates eye movement abnormalities in ADHD/HKD. Examination of eye movement disorders can be an effective diagnostic method, because some of the brain areas responsible for the control of these movements, namely the basal ganglia (BG) and the dorsolateral prefrontal cortex (DLPFC), are implicated in the pathogenesis of ADHD/HKD.

From Hippocrates to statistics in medicine. An historical perspective

Gryglewski R.1

1Department of Medical History, Jagiellonian University Medical College, Poland

The purpose of this presentation is to show, in a historical perspective and on selected examples, the transformation of a biological model of life functions which is clearly present in medical theories from the times of Hippocrates until the present day. Hippocrates introduced into the art of medical practice a level of philosophical reflection and thereby raised it to the level of science, thus creating medicine. We can say that by shaping his theory of life, based on the principle of balance of four humours, Hippocrates created de facto the first model of a living organism, in its proper physiological condition as well as in its pathological symptoms. The influence of Hippocratic thought on the development of European medicine was significant, and it became an inspiration for many generations of physicians seeking a final definition of health and disease. Santorio's physiological experiments, Sydenham's nosological system, the concepts of the iatrophysicists and iatrochemists, or Boerhaave's clinical teaching: all these can be seen as transformations of the original Hippocratic model. There was a constant need for ever more sophisticated tools to fulfil this aim. The introduction of statistical methods into 19th-century medicine opened new ways for scientific investigation; the numerical method used by the French physician Pierre-Charles-Alexandre Louis was the very first step in that direction. It was the crucial move towards modern clinical trials. From a historical perspective we clearly see the important role statistics played in the formation of pathological and epidemiological models, dating back to Semmelweis's work on puerperal fever, and its full impact on the development of the EBM model.

Homology modeling of the human sodium dependent glucose transporter hSGLT1 (SLC5A1)

Kłys M.1, Kraszewski S.1

1Faculty of Fundamental Problems of Technology, Wrocław University of Technology, Poland

According to the theory of Otto Warburg [1], the prime cause of cancer is the replacement of aerobic respiration, present in normal cells, by sugar fermentation. It has been found that human SGLT1 is expressed in colorectal, gastrointestinal and kidney tumours [2]. Therefore, investigation of this transporter can be crucial for the development of new kinds of cancer treatment involving membrane protein inhibition. Due to the lack of an experimentally determined human SGLT1 structure, we used the spatial structure of vSGLT (the bacterial homologue), which shares 32% amino acid sequence identity (60% similarity), and the MODELLER software to obtain the human SGLT1 model. Model preparation was supported by protein secondary structure prediction software such as HMMTOP, DAS, TMHMM, TOPPRED2, MEMSAT-SVM, MEMSAT3, PSIPRED, RAPTORX and NETSURF. Different amino acid sequence alignments between vSGLT and hSGLT1, required by MODELLER to produce an accurate model, were investigated. On this basis, the model that satisfied the expectations of the lowest energy and the best DOPE score profile for the model and template, and that passed structural analysis, was chosen for further molecular dynamics (MD) study. A system containing the human SGLT1 transporter embedded in a fully hydrated membrane was simulated using MD. The stability of the system over time was measured by the root mean square deviation. The sugar and sodium binding sites of human SGLT1 were compared with the binding sites of vSGLT to validate the obtained model.
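The 32% identity / 60% similarity figures quoted above come from a pairwise sequence alignment; the calculation itself can be sketched as follows. The toy aligned fragments and the crude similarity groups are assumptions for illustration, not the actual vSGLT/hSGLT1 alignment or a real substitution matrix such as BLOSUM62:

```python
# Crude amino-acid similarity groups (an illustrative assumption only)
SIMILAR = [set("ILVM"), set("FYW"), set("KRH"), set("DE"), set("ST"), set("NQ")]

def identity_and_similarity(seq_a, seq_b):
    """Percent identity and similarity over gap-free aligned columns."""
    assert len(seq_a) == len(seq_b)
    aligned = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    ident = sum(a == b for a, b in aligned)
    simil = sum(a == b or any(a in g and b in g for g in SIMILAR)
                for a, b in aligned)
    return 100 * ident / len(aligned), 100 * simil / len(aligned)

# Hypothetical aligned fragments ('-' marks a gap)
ident, simil = identity_and_similarity("MKT-LLIVSA", "MRSALLLV-A")
```

On these toy fragments the identity is 62.5% while the similarity is 100%, illustrating why the two percentages quoted for vSGLT/hSGLT1 differ so much.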

[1] Warburg O., Science (1956), 123/3191; 309-314

[2] Wright E.M., Loo D.D.F., Hirayama B.A., Physiological Reviews (2011), 91/2; 733-794

Simulations in the diagnosis and treatment of diseases of the hip and knee

Dygut J.1, Kuchta M.2

1The Regional Hospital St. Padre Pio in Przemyśl, Poland

2Faculty of Electronics, Military University of Technology, Poland

With increased concerns about patient safety comes an increase in the application of new technologies in medical education. This is undoubtedly fostered by developments in electronics, computer science, engineering and materials, associated with continuous advances in medicine. Today the use of medical simulators and computer simulation programs for training various skills, for example in surgery, is expanding rapidly. Thanks to these technologies, before starting to work in a clinical environment the doctor has a chance to rehearse certain skills in a non-clinical environment. Orthopedics and rehabilitation deserve special mention in this context, because in many areas they have initiated the development of computer simulation programs and simulators in research, diagnostics and operations. This is due to the fact that these branches of medicine by their nature consider the person as a whole, being strictly based on the statics and dynamics of the human musculoskeletal system and not only on individual organs. At present, the development of orthopedics means that residents must, in a much shorter period of time, acquire the knowledge and skills that their older colleagues gained over years of learning while performing surgery on patients. As a result, alternative teaching methods, such as training on simulators and computer simulations, are becoming more desirable in the training of residents, as well as in refresher training for senior staff in the event of the deployment of, e.g., new surgical techniques. The traditional, yet effective, training model in medicine is based on the principle of "master and disciple" and involves teaching on patients according to the old adage: "see one, do one, teach one". Changing the current rules of work organization in hospitals could cause a rupture in the bond between trainer and trainee. To cope with this situation, it is necessary to introduce a new model of education.
Much has been written in the scientific literature about a paradigm change in the education of orthopedic surgeons that takes into account the integration of new technologies into the teaching profession. Synthetic training will then be necessary. The use of simulation in medicine, as in aviation, will become a standard. The point is that, thanks to skills acquired in this way, it is possible to increase the level of patient safety, learn teamwork, and improve communication with the patient. Attention has been focused on the graphic and spatial presentation of examples of equipment, computer programs and electronic materials used in e-learning simulations related to the diagnosis and treatment of diseases of the hip and knee. The key to the preparation of the poster was a review of the current state of knowledge in this field and its spatial representation. The simulation and simulator issues are depicted on the poster in two aspects: (i) technical, i.e. the use of a device to mimic another device (an imitator), artificially reproducing the properties of an object or the course of specific processes; in the technical aspect the concept of simulation is thus synonymous with a specific device, i.e. a simulator; (ii) computational, i.e. the application of a particular computer program to reproduce the performance of certain equipment or specific processes; in this aspect the notion of simulation is closely associated with the concept of computer simulation, which is the study of the behaviour of real objects on the basis of computer programs which simulate this behaviour.

Modeling the dynamics of the knee joint in Simulink

Biaduń M.1, Kuchta M.1, Dygut J.2

1Faculty of Electronics, Military University of Technology, Poland

2The Regional Hospital St. Padre Pio in Przemyśl, Poland

Computer modeling and simulation are commonly used research methods in many areas of science, including biomechanics. Rapid progress in computer technology, both in hardware and software, has contributed to the development of these methods. The range of their applications includes support of biomechanical analysis, sports training and rehabilitation. Simulink is a program used for computer modeling. It is part of the MathWorks MATLAB mathematical package. It is an interactive software package intended for the modeling, simulation and analysis of dynamic systems. Simulink is equipped with an intuitive user interface. It contains a special block diagram library and allows creating components customized to specific needs. The paper presents a mathematical model of the knee joint developed in Simulink. The model was used to study its parameters under dynamic conditions.
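Although the authors built their model in Simulink, the core idea of a second-order dynamic model of the lower leg swinging about the knee can be sketched in plain Python; all parameter values below are illustrative assumptions, not taken from the paper:

```python
import math

def simulate_knee(theta0, steps=20000, dt=0.0005,
                  inertia=0.35, damping=0.6, m_g_l=14.0, torque=0.0):
    """Damped pendulum model of the shank about the knee axis:
    I*theta'' = -b*theta' - m*g*l*sin(theta) + T.
    Returns the angle trajectory (rad), integrated with explicit Euler."""
    theta, omega = theta0, 0.0
    trajectory = [theta]
    for _ in range(steps):
        alpha = (-damping * omega - m_g_l * math.sin(theta) + torque) / inertia
        omega += alpha * dt
        theta += omega * dt
        trajectory.append(theta)
    return trajectory

traj = simulate_knee(theta0=0.5)  # release from 0.5 rad of flexion
```

Released from rest, the simulated shank oscillates and settles back to the vertical equilibrium, which is the kind of transient behaviour the Simulink model studies under dynamic conditions.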

Dynamics studies of the knee joint for different types of loads

Biaduń M.1, Kuchta M.1, Dygut J.2

1Faculty of Electronics, Military University of Technology, Poland

2The Regional Hospital St. Padre Pio in Przemyśl, Poland

Modern clinical and sports medicine increasingly uses training and diagnostic equipment. It allows monitoring of the training process, control of its cycle, and evaluation of its basic parameters. These devices are used to carry out the training process for healthy people as well as the rehabilitation of persons with impaired motor functions. Their measurement capabilities allow many biomechanical parameters and functions to be determined. They can estimate the state of a particular biomechanical group and monitor its efficiency over a long period of time. The main aim of the research was to determine the impact of the type of load on the dynamic parameters during straightening of the knee joint. The parameters of the affected and healthy limbs were compared under the influence of a particular load. The article presents the results of measurements of the dynamic parameters of the knee joint, obtained using different types of loads in healthy people. The second part of the paper concerns the comparison of the functioning of the affected and healthy limbs of the test person.

An application of the automatic speech recognition to estimate contents of meals in order to calculate insulin dosage in diabetic patients

Foltyński P.1, Ładyżyński P.1, Pankowska E.2, Wójcicki J.1

1Polish Academy of Sciences, Nałęcz Institute of Biocybernetics and Biomedical Engineering, Poland

2The Institute of Mother and Child, Poland

Proper control of postprandial glycemic excursions constitutes an important problem in diabetes treatment. Typically, patients with diabetes determine the nutritional value of a meal using their judgment and experience in order to determine an appropriate amount of insulin compensating for the nutritional contents of the meal. The aim of the work was to design and develop a system that calculates insulin doses compensating for meals based on their verbal description. We developed an algorithm for determining the insulin dose related to a meal comprising the steps of: (1) processing the verbal description of the meal by an automatic speech recognition (ASR) algorithm to change it into a textual description, (2) analysis of the textual description of the meal to determine its nutritional value (the developed algorithms are language-independent), (3) determining a complementary dosage of insulin based on the content of the meal and its nutritional value. The designed method was implemented in a system, called VoiceDiab, consisting of: the smartphone application, the ASR server, the meal analyzing server and the insulin dose calculator. Two types of the system were developed, i.e. the client-server system and the stand-alone system. Preliminary tests of the client-server system in a group of patients with diabetes demonstrated that it was able to properly recognize 98.5% of food products. Taking into consideration that the usage of smartphones is widespread and that ASR algorithms are becoming more and more effective, we conclude that VoiceDiab might become a useful tool for patients with diabetes who are not sufficiently educated to effectively control postprandial glucose changes.
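The three-step dose-calculation pipeline can be sketched as follows. The food table, the number words, the insulin-to-carbohydrate ratio, and the trivial stand-in for the ASR step are all illustrative assumptions, not the actual VoiceDiab components:

```python
# Step 1 stand-in: in the real system an ASR server turns speech into text;
# here the "recognized" text is given directly.
recognized_text = "two slices of bread and one apple"

# Step 2: map food words in the text to carbohydrate content
# (grams per unit; hypothetical values).
FOOD_CARBS = {"bread": 15.0, "apple": 20.0}   # per slice / per piece
NUMBERS = {"one": 1, "two": 2, "three": 3}

def meal_carbohydrates(text):
    """Scan the textual meal description, pairing quantities with foods."""
    total, count = 0.0, 1
    for word in text.lower().split():
        if word in NUMBERS:
            count = NUMBERS[word]
        elif word in FOOD_CARBS:
            total += count * FOOD_CARBS[word]
            count = 1
    return total

# Step 3: convert carbohydrates to an insulin bolus using an (assumed)
# insulin-to-carbohydrate ratio of 1 unit per 10 g.
def insulin_dose(carbs_g, ratio_g_per_unit=10.0):
    return carbs_g / ratio_g_per_unit

carbs = meal_carbohydrates(recognized_text)
dose = insulin_dose(carbs)
```

For the example sentence the sketch finds 50 g of carbohydrate and proposes a 5-unit bolus; the real meal-analysis server of course handles far richer descriptions and nutritional components.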

The role of thermal models in evaluation of wound healing processes

Nowakowski A.1, Kaczmarek M.1

1Department of Biomedical Engineering, Gdansk University of Technology, Poland

We have developed a new method and instrumentation for Infra-Red (IR) thermal imaging that we call Active Dynamic Thermography (ADT). Over the last few years we have been trying to prove the value of this method in several medical applications, such as the evaluation of skin burn treatment, cardiosurgery wound healing and others. Crucial for understanding healing processes and their quantitative description are thermal models of the tested tissues. ADT belongs to the group of model-based methods that use a quantitative approach to show changes of the applied descriptors objectively. The core procedure is the observation of thermal transient processes. Using a cooling excitation in the form of a step function, the thermal properties of the tested tissue at the region of interest (ROI) are determined. Both advanced 3-D thermal models of the tested ROI and the simplest two-exponential analytical expression are developed for the description of the same biological structure. Thermal time constants are practically sufficient for the quantitative characterization of affected tissue and the prediction of its healing. During the last 10 years several in vivo study series with animals (in total more than 100 pigs), as well as clinical cases covering ca. 400 cardiosurgery patients, have demonstrated the high value of the proposed procedures. The main outcomes of these studies are new diagnostic methods. One allows proposing proper treatment of skin burns, showing on the second day after the burn the region demanding surgical intervention. The second allows evaluating the post-cardiosurgery wound in terms of possible complications.
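The "simplest two-exponential analytical expression" mentioned above describes the temperature transient after the cooling excitation; a minimal sketch, with assumed amplitudes and time constants rather than fitted clinical values, shows how the thermal time constants govern the recovery:

```python
import math

def thermal_transient(t, a1, tau1, a2, tau2):
    """Two-exponential descriptor of the temperature change after a
    step cooling excitation: dT(t) = a1*exp(-t/tau1) + a2*exp(-t/tau2)."""
    return a1 * math.exp(-t / tau1) + a2 * math.exp(-t / tau2)

# Assumed descriptors: a fast superficial component and a slow deep one
a1, tau1 = 2.0, 1.5    # K, s
a2, tau2 = 1.0, 12.0   # K, s

dT0 = thermal_transient(0.0, a1, tau1, a2, tau2)   # initial deviation: 3 K
dT_late = thermal_transient(5 * tau1, a1, tau1, a2, tau2)
slow_only = a2 * math.exp(-5 * tau1 / tau2)
# after ~5 fast time constants the transient is dominated by the slow term
```

In practice the two amplitude/time-constant pairs are fitted to the measured ROI transient, and changes in the time constants serve as the quantitative descriptors of the affected tissue.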

Acknowledgement: This work was partly financed by the grant NCN UMO-2011/03/B/ST7/03423.

The multifractal analysis of the kinesiological surface electromyography signal

Trybek P.1, Machura Ł.1, Nowakowski M.2

1Department of Theoretical Physics, University of Silesia, Poland

2Jagiellonian University Medical College, Poland

We propose an approach to the analysis of the kinesiological surface electromyography (sEMG) signal based on a method concerning the scaling properties of time series, namely Multifractal Detrended Fluctuation Analysis (MFDFA). We would like to evaluate the usefulness of sEMG in the assessment of the level of neuromuscular activation during complex movements on a laparoscopic trainer. The goal of this study was to investigate the difference between the signal obtained from a person unskilled in the use of laparoscopic tools and the same person after a series of training sessions. Volunteers (equal gender distribution, 24-27 years of age) were recruited for an experiment on the laparoscopic trainer. Novice users had to tie surgical knots in the allotted time. The measurement was performed for complete beginners and repeated after the training series. Muscular activity was recorded from four groups of muscles on each arm using bipolar concentric surface AgCl electrodes. The measurements were conducted with an 8-channel surface EMG recorder. Qualitative analysis of a complex motor task requires appropriate tools; therefore we decided to apply a method which does not rely on traditional statistical tests. The analysis is based on the parameters describing the multifractal spectrum. The mean values of the Hurst exponent and the spectrum width were calculated. Statistically significant differences (p < 0.05) were found for a selected group of muscles. The proposed method of multifractal analysis, MFDFA, can bring us closer to understanding the true nature of the process during complex movements.
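The monofractal core of the MFDFA procedure (integrated profile, segment-wise detrending, log-log scaling of the fluctuation function) can be sketched in plain Python; this is a simplified illustration, not the authors' implementation, and the full multifractal version additionally varies the order q of the fluctuation average:

```python
import math, random

def dfa_exponent(signal, scales=(16, 32, 64, 128, 256)):
    """Estimate a Hurst-like scaling exponent of a time series via DFA."""
    n = len(signal)
    mean = sum(signal) / n
    profile, acc = [], 0.0
    for x in signal:                  # integrated, mean-subtracted profile
        acc += x - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        segments = n // s
        var_sum = 0.0
        for seg in range(segments):
            ys = profile[seg * s:(seg + 1) * s]
            xs = range(s)
            # closed-form least-squares line fit within the segment
            mx, my = (s - 1) / 2, sum(ys) / s
            slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                     / sum((x - mx) ** 2 for x in xs))
            var_sum += sum((y - (my + slope * (x - mx))) ** 2
                           for x, y in zip(xs, ys)) / s
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(var_sum / segments))
    # slope of log F(s) versus log s
    ms, mf = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
    return (sum((a - ms) * (b - mf) for a, b in zip(log_s, log_f))
            / sum((a - ms) ** 2 for a in log_s))

random.seed(1)
white = [random.gauss(0, 1) for _ in range(4096)]
h = dfa_exponent(white)   # uncorrelated noise gives an exponent near 0.5
```

Departures of the exponent from 0.5, and the width of the multifractal spectrum obtained when q is varied, are the descriptors compared between the untrained and trained sEMG recordings.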

A method of predicting the secondary protein structure based on dictionaries

Fabian P.1, Stąpor K.1

1Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Poland

The shape of a protein chain may be analyzed at different levels of detail. In many cases a description of the local shape, the secondary structure, is enough to find out some properties of proteins. The problem of finding the precise 3D shape given only the amino acid sequence is very complex, but the secondary structure may be found even without the full 3D information. Many methods have been developed for this purpose, usually based on similarities to other proteins with known shapes. The presented paper proposes a method based on dictionaries of known structures. The dictionary method has been developed based on data available in the PDB database. It takes into account a large number of protein chains (tens of thousands) and tries to recreate the analyzed chain from subchains found in the database. The corresponding secondary structure is assumed to be preserved in the analyzed chain. All tests have been performed in silico with the PDB database as the source of both the training and testing data. The overall performance of the method depends heavily on the size of the dictionary. Experiments have been performed with testing sets not similar to any elements in the training set. To speed up the processing, associative arrays implemented as hash tables have been used. Accuracies of up to 79% have been achieved. Results of the basic method are of course not as good as those of sophisticated methods involving complex alignment algorithms, evolutionary information etc., but the method may be improved by modifying the way subchains are compared (allowing approximate matching) and the way the statistics are collected.
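A toy version of the dictionary approach, hashing fixed-length subchains to their observed secondary-structure fragments and then voting per residue, might look as follows; the tiny "training set" and the subchain length are invented for illustration, whereas the real method uses a PDB-derived dictionary of tens of thousands of chains:

```python
from collections import Counter, defaultdict

K = 3  # subchain length (illustrative; the real dictionary uses PDB data)

def build_dictionary(training_pairs):
    """Map every length-K amino-acid subchain to the secondary-structure
    fragments observed for it (H = helix, E = strand, C = coil)."""
    dictionary = defaultdict(list)
    for sequence, structure in training_pairs:
        for i in range(len(sequence) - K + 1):
            dictionary[sequence[i:i + K]].append(structure[i:i + K])
    return dictionary

def predict(sequence, dictionary):
    """Per-residue majority vote over all matching subchain fragments."""
    votes = [Counter() for _ in sequence]
    for i in range(len(sequence) - K + 1):
        for fragment in dictionary.get(sequence[i:i + K], []):
            for j, state in enumerate(fragment):
                votes[i + j][state] += 1
    return "".join(v.most_common(1)[0][0] if v else "C" for v in votes)

training = [("ALAKAA", "HHHHHH"), ("VTVTV", "EEEEE"), ("ALAKG", "HHHHC")]
dictionary = build_dictionary(training)
pred = predict("ALAKA", dictionary)
```

The Python dict here plays the role of the hash-table associative array mentioned in the abstract; approximate subchain matching would relax the exact-key lookup in `predict`.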

Mixed spatial evolutionary games

Świerniak A.1, Krześlak M.1

1Institute of Automatic Control, Silesian University of Technology, Poland

We propose a new scheme of modeling spatial interactions between cancer cells based on the idea of Spatial Evolutionary Games. Our approach explicitly takes into account the heterogeneity of cancer cells, which leads to the conclusion that cells do not represent selected phenotypes (strategies) but rather their mixture. To our knowledge, the first paper in which spatial evolutionary games were used to model spatial effects of interactions between cancer cells is [1]. The games (SEGT models) are played iteratively on a lattice forming a torus. The following steps are performed in every iteration: payoff updating (the sum of local fitness in the neighborhood), removing players (cell mortality), and reproduction (defining which phenotype will occupy an empty place). We propose to describe the strategies as a mixture of phenotypes given by their frequencies of occurrence. The spatial games resulting from this assumption will be called mixed spatial evolutionary games (MSEG). This modification of the way spatial games are used requires a change in the definition of the local fitness. It is defined similarly to the expected result of a game with mixed strategies: the result given by each pair of strategies is multiplied by their frequencies of occurrence. The analysis is more complex, due to an increased number of feasible spatial structures. In our study we compare simulation results for different models based on spatial evolutionary games describing the interaction of cancer cells [2].
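The mixed-game fitness update described above, in which each lattice cell carries a frequency vector over phenotypes and local fitness is the expected payoff against the neighbourhood, can be sketched as follows; the 2x2 payoff matrix and the 3x3 torus are illustrative assumptions, not the cancer-cell models of [1, 2]:

```python
# Payoff matrix for two phenotypes (rows: own phenotype, columns: opponent's)
PAYOFF = [[1.0, 0.2],
          [1.4, 0.5]]

def expected_payoff(p, q):
    """Expected result of the game between mixed strategies p and q:
    each pairwise payoff weighted by the phenotype frequencies."""
    return sum(p[i] * PAYOFF[i][j] * q[j]
               for i in range(2) for j in range(2))

def local_fitness(lattice, r, c):
    """Sum of expected payoffs against the 8 neighbours on a torus."""
    rows, cols = len(lattice), len(lattice[0])
    total = 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            total += expected_payoff(lattice[r][c],
                                     lattice[(r + dr) % rows][(c + dc) % cols])
    return total

# 3x3 torus where every cell holds the same 50/50 phenotype mixture
lattice = [[(0.5, 0.5)] * 3 for _ in range(3)]
fitness = local_fitness(lattice, 1, 1)
```

Mortality and reproduction then act on these fitness values exactly as in the pure-strategy SEGT iteration, but with frequency vectors rather than single phenotypes being propagated.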

[1] Bach, L. et al.: Spatial evolutionary games of interactions among generic cancer cells. J Theor Med 5, 47-58(2003)

[2] Swierniak, A., Krzeslak, M.: Application of evolutionary games to modeling carcinogenesis. Math Biosci Eng 10(3), 873-911(2013)

Breast deformation modelling based on MRI images - case study

Danch-Wierzchowska M.1, Borys D.1, Świerniak A.1

1Institute of Automatic Control, Silesian University of Technology, Poland

Recent studies indicate growing trends in breast cancer incidence. To obtain crucial diagnostic information two types of examination are performed: MRI (Magnetic Resonance Imaging), which shows the structure of the treated tumour, and PET (Positron Emission Tomography), in which metabolically active structures are marked. The two types of examination are performed in different patient positions (MRI: prone, PET: supine), which makes them difficult to compare. The main goal of this study is to create a deformation method that would allow merging images from the MRI and PET diagnostic procedures. The resulting image is created as if in the supine position, which is more informative for surgery. The developed deformation method takes advantage of the OsiriX and ANSYS software. The first step of the method is the segmentation of breast tissues from the MRI images. Based on the ROIs detected in the OsiriX software, a spatial mesh is created and imported as a solid into Mechanical ANSYS. Next, the deformation of the model due to gravitational forces is simulated. The obtained results are compared with PET images. Based on recent results the method is precise concerning tumour dislocation, and the best mapping is obtained along the gravitational force axis. However, there are some inaccuracies in deforming the body surface. The described method proved to be more precise than commonly used non-rigid registration methods, yet further development is still needed. The current results were described as satisfactory by the MDs cooperating in this project. The next step of this work will be the interpolation of the MRI images onto the deformed mesh created in the process described above. Another field of improvement is method automation.

Acknowledgement: Grant BK-265/RAu1/2014 Zad3

Modelling cardiovascular hemodynamics during the Valsalva maneuver

Pstraś L.1, Thomaseth K.2, Waniewski J.1

1Institute of Biocybernetics and Biomedical Engineering, Polish Academy of Sciences, Poland

2Institute of Electronics, Computer and Telecommunication Engineering, Italian National Research Council, Italy

The Valsalva maneuver used as a clinical autonomic test features a complex cardiovascular response with the concomitant action of several regulatory mechanisms whose nonlinear interactions are difficult to analyse without the aid of a mathematical model. The aim of this work was to develop a relatively simple model of this maneuver (simpler than the existing models) that nevertheless provides a good representation of the corresponding hemodynamic response and thus enables the assessment of different regulatory mechanisms. A new non-pulsatile compartmental model of the cardiovascular system was developed with a variable intrathoracic pressure and nonlinear functions to describe the pressure-volume relationships of veins, the dependence of cardiac output on preload and afterload, and three baroreflex mechanisms acting on heart rate, systemic resistance and venous unstressed volume. The model was validated on data from past Valsalva tests from the Padua Hospital in Italy. The proposed model fits clinical data well, for patients with both typical and abnormal hemodynamic responses to the Valsalva maneuver. The impact of the individual baroreflex mechanisms in different phases of the maneuver is clearly visible. The model is most sensitive to the parameters describing the pressure-volume relationship of the systemic veins. The model provides a useful tool for analysing the interactions between the cardiovascular system and autonomic regulatory mechanisms and for the interpretation of Valsalva maneuver results. With certain improvements, the model could be used for the analysis of other reflex tests or different cardiovascular perturbations.
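Baroreflex mechanisms in such models are typically represented by sigmoidal set-point functions; a minimal sketch of a heart-rate baroreflex follows, with all parameter values assumed for illustration rather than taken from the model or the Padua data:

```python
import math

def baroreflex_hr(arterial_pressure, hr_min=50.0, hr_max=180.0,
                  set_point=95.0, gain=0.1):
    """Sigmoidal baroreflex: heart rate (bpm) falls as arterial pressure
    (mmHg) rises above the set point, saturating at hr_min and hr_max."""
    return hr_min + (hr_max - hr_min) / (
        1.0 + math.exp(gain * (arterial_pressure - set_point)))

hr_at_set = baroreflex_hr(95.0)        # midpoint of the sigmoid: 115 bpm
hr_hypotension = baroreflex_hr(60.0)   # low pressure -> reflex tachycardia
hr_hypertension = baroreflex_hr(130.0) # high pressure -> reflex bradycardia
```

In the full model, analogous sigmoids drive systemic resistance and venous unstressed volume, and their combined action shapes the characteristic four-phase Valsalva response.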

Human body posture as a source of information about various diseases

Długosz M.1, Kurzydło W.2

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

2Institute of Physiotherapy, Jagiellonian University Medical College, Poland

Human Body Posture (HBP) is a parameter assessed by almost all physicians, physiotherapists and other professionals, but most commonly in clinical practice HBP is analyzed only by subjective visual assessment. The aim of the study is to show the difference between the body posture of healthy people and those suffering from specific disorders, both those clearly connected with HBP (scoliosis, coxarthrosis) and those seemingly unrelated (e.g. depression). The study was conducted based on the results of photogrammetric measurements of patients standing in a relaxed posture, scanned with a PBE system (Photogrammetrical Body Explorer). The research included a group of 190 people aged 13 to 88 years. Patients were divided into subgroups based on the diagnosed disease entity: coxarthrosis (33 people), discopathy (16 people), scoliosis (25 people), depression (33 people) and chronic fatigue syndrome (47 people). A control group consisted of 36 healthy volunteers aged 19-29 years, with no identified defects of body posture. To evaluate the differences between the HBP of healthy and sick people, an HBP model based on 29 parameters describing the HBP in three anatomical planes was created. The research showed significant differences in the HBP of healthy and sick people. In addition, patients with depression differed considerably from the others. The results of the analysis indicate that an objective assessment of the HBP can be a source of relevant information on the general health of the patient and may be a useful tool in the diagnosis of various diseases.

The human iron exporter ferroportin. Insight into the transport mechanism by molecular modelling and site-directed mutagenesis

Bonaccorsi di Patti M.1, Tortosa V.2, Musci G.3, Polticelli F.2

1Department of Biochemical Sciences, Sapienza University of Roma, Italy

2Department of Sciences, Roma Tre University, Italy

3Department Biosciences and Territory, University of Molise, Italy

Ferroportin (Fpn), a membrane protein belonging to the Major Facilitator Superfamily of transporters, is the only vertebrate iron exporter known so far. Several Fpn mutations lead to the so-called 'ferroportin disease' or type 4 haemochromatosis, characterized by two distinct iron accumulation phenotypes depending on whether the mutations affect the protein's activity or its degradation pathway. Through ab initio molecular modelling techniques, structural models of Fpn in the three mechanistically relevant conformations (inward-open, occluded and outward-open) were built. Analysis of the models led to the identification of potential iron binding sites in the inward-open and occluded states, whose relevance was tested through measurement of the iron binding and export efficiency of wild-type and mutated Fpn. In line with the modelling analysis, aspartates 39, 181 and 325, and Arg466, were found to play a major role in iron export. Further, analysis of the outward-open model of Fpn allowed us to provide a mechanistic explanation for pathogenic mutations observed in type 4 haemochromatosis patients. Based on these results, a possible mechanism for iron export will be proposed.

Integration of omics data to unravel healthy prostate response in broccoli intervention study

Jurkowski W.1

1The Genome Analysis Centre, Norwich Research Park, Norwich, NR4 7UH, United Kingdom

Epidemiological studies of cruciferous crop (Brassica oleracea) consumption show a reduction in the incidence and progression of prostate and colon cancer, among others. Cabbage, broccoli and cauliflower are a source of organo-sulphur compounds: S-methyl cysteine sulphoxide and glucosinolates. Metabolism of glucoraphanin in the gut lumen leads to sulforaphane (SF), which exerts the mild pro-oxidant activity required for the induction of phase 2 and antioxidant genes, which in turn reduces the level of inflammatory cytokines. We are conducting a dietary intervention prostate cancer study to elucidate the effects of increased levels of SF on the molecular phenotype in healthy prostate tissue. The aims of this pilot study are to provide, for the first time, evidence in humans for the critical role of SF in regulating metabolism, and to identify the underlying mechanisms by which SF can reduce the risk of aggressive prostate cancer. For the sake of improved interpretation of the simultaneous gene expression and metabolic response, we are challenging state-of-the-art methods to integrate omics data. In the integration of multiple data types, e.g. transcriptomics and metabolomics, typical statistical approaches such as canonical correlation analysis (CCA) fail due to the small number of observations (e.g. conditions, diets, etc.) in comparison to the data size (number of genes, metabolites). Currently available modifications are not ideal, either because of the need to add simulated data, resulting in the lack of p-value computation, or because of the pruning of variables and hence the loss of potentially valid information. Our approach reduces the complexity by making use of functional relationships in the data. The workflow includes functional subdivision of the data sets to reach the expected data structure, statistical analysis within groups, and initial interpretation of results. Including gene transcription data can effectively help in interpreting variation in the concentrations of key metabolites.
By applying pathway and network analysis, data obtained from various platforms (e.g. microarrays) are grouped with moderate stringency to avoid functional bias. As a consequence, classical CCA and other multivariate models can be applied to calculate robust statistics and provide easy-to-interpret associations between lipids and the corresponding genes. We are able to demonstrate that our approach improves the detection of metabolite-function-related genes, in comparison to applying rCCA or PLS statistics alone.
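The "statistical analysis within groups" step can be illustrated with a plain Pearson correlation computed inside each functional group; a simple pairwise correlation is used here as a stand-in for the CCA-type multivariate models, and the group, gene and metabolite values are invented:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Gene expression and metabolite concentration across 5 samples
# (hypothetical values), pre-grouped by pathway membership
groups = {
    "glucosinolate_pathway": {
        "genes": {"GENE_A": [1.0, 2.1, 2.9, 4.2, 5.1]},
        "metabolite": [0.9, 2.0, 3.1, 3.9, 5.0],
    },
}

correlations = {}
for group_name, group in groups.items():
    for gene, expression in group["genes"].items():
        correlations[(group_name, gene)] = pearson(expression,
                                                   group["metabolite"])
```

Restricting the variables to one functional group keeps the variable count comparable to the number of observations, which is exactly the dimensionality problem that defeats CCA when applied to the full omics data sets.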

Modeling figure/ground separation with spiking neurons

Ebner M.1, Hameroff S.2

1Institut für Mathematik und Informatik, Ernst Moritz Arndt Universität Greifswald, Germany

2Departments of Anesthesiology and Psychology and Center for Consciousness Studies, The University of Arizona, USA

The human brain routinely performs figure/ground separation, like picking up a cup on a table or grasping an apple from a tree. Figure/ground separation is used not only in vision but also in separating voice from chatter, in touch and in other modalities, and it remains a challenge in artificial intelligence and machine learning. To approach this problem we have demonstrated figure/ground separation in a laterally-connected sheet of 'integrate-and-fire' spiking neurons. Biologically, the lateral connections and sheet are based on 'gap junction' electrical synapses between the dendrites and somas of brain neurons, e.g. cortical layer 5 pyramidal cells and interneurons. The method is based on locally computing the average integration potential for each neuron using lateral connection inputs as well as direct sensory inputs. We show results for artificial as well as real-world images in which the model adaptively extracts a figure from the background, irrespective of the actual numerical values of the figure's pixels. Lateral modulation enables collective integration and synchronized firing/spiking of large groups of neurons. As lateral connections open and close, a zone of integration moves through the larger system. With further development, such 'mobile zones' offer models for executive agency, causal action, attention and correlates of consciousness in intelligent systems.
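The core mechanism, an integrate-and-fire unit whose potential is pulled toward the local average of its lateral neighbours, can be sketched in one dimension; the leak, coupling, threshold and input values are illustrative assumptions, not the parameters of the authors' model:

```python
def simulate_sheet(inputs, steps=200, dt=0.1, leak=0.1,
                   coupling=0.3, threshold=1.0):
    """1-D ring of leaky integrate-and-fire neurons with lateral coupling:
    each neuron leaks, integrates its sensory input, and is pulled toward
    the mean potential of its two lateral neighbours; crossing the
    threshold emits a spike and resets the potential."""
    n = len(inputs)
    v = [0.0] * n
    spikes = [0] * n
    for _ in range(steps):
        lateral = [(v[(i - 1) % n] + v[(i + 1) % n]) / 2 for i in range(n)]
        for i in range(n):
            dv = -leak * v[i] + inputs[i] + coupling * (lateral[i] - v[i])
            v[i] += dv * dt
            if v[i] >= threshold:
                spikes[i] += 1
                v[i] = 0.0
    return spikes

# A "figure" (strong input) embedded in a weak background
inputs = [0.02] * 4 + [0.3] * 3 + [0.02] * 4
spikes = simulate_sheet(inputs)
```

With these settings only the figure neurons reach threshold and fire, while the background stays silent, which is the 1-D analogue of the figure emerging from the background irrespective of absolute pixel values.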

Patent protection of computer-implemented inventions

Pawłowski A.1


Inventors in the IT sector are often faced with a problem: does copyright protect our software adequately, or should we seek broader protection? The presentation explains the key differences between copyright and patent protection to allow the inventor to take a well-founded decision on how to protect the intellectual property in software. First, the types of computer-implemented inventions are discussed to present the concepts of “technical” and “non-technical” inventions. These concepts are treated differently at the Polish and the European Patent Office, and the differences are explained to help select the office at which the patent application should be filed. Next, the patenting process is explained, with tips on how to speed it up so that the patent is granted before the software becomes out of date. Examples of patented software algorithms are presented.

Computer simulation of the regulation mechanism in metabolic diseases

Świerkosz A.1

1AGH University of Science and Technology, Poland

Diabetes mellitus is a group of metabolic diseases caused by malfunction of the blood sugar regulatory processes; it has been reported to affect 8.3% of the adult population, i.e. nearly 400 million people worldwide. This paper provides a review of facts and principles important for understanding the regulation mechanisms and the role of insulin. The author relies on mathematical modeling of these mechanisms and presents several formulas and computer applications dedicated to diabetes care. The modeling aims to find a correct dose of insulin as a response to a series of measurements of glucose concentration. In conclusion, the author recommends selected methods for personal self-checks of glucose level and stresses the importance of regularly checking blood-related parameters.
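
The kind of regulatory model the paper relies on can be sketched with a toy two-equation glucose-insulin system, loosely in the spirit of Bergman's minimal model. The equations, parameter values and function name below are purely illustrative assumptions, not the author's formulas:

```python
def simulate_glucose(insulin_dose, hours=6.0, dt=0.01):
    """Euler integration of a toy glucose-insulin model.

    dG/dt = -p1*(G - Gb) - si*I*G    (glucose returning to baseline,
                                      accelerated by circulating insulin)
    dI/dt = -ki*I                    (plasma insulin decaying after a bolus)

    All parameter values are illustrative, not fitted clinical values.
    """
    Gb, p1, si, ki = 90.0, 0.03, 0.0005, 0.3
    G, I = 180.0, insulin_dose        # start from a hyperglycaemic state
    t = 0.0
    while t < hours:
        dG = -p1 * (G - Gb) - si * I * G
        dI = -ki * I
        G += dt * dG
        I += dt * dI
        t += dt
    return G

# A larger bolus should bring glucose further down over the same period.
g_low = simulate_glucose(insulin_dose=5.0)
g_high = simulate_glucose(insulin_dose=30.0)
```

Dose-finding then amounts to searching for the bolus that steers the simulated glucose trajectory toward the target range given the measured starting concentration.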

Brain-computer interface for wireless remote control

Dindorf R.1, Woś P.1

1Department of Mechatronic Systems, Kielce University of Technology, Poland

The conducted laboratory tests confirm that various brain electrical signals - bioelectrical signals (BES) - may be used in a wireless remote control process. A brain-computer interface (BCI) translates these signals into outputs that allow users to communicate without the participation of peripheral nerves and muscles. The natural biosignals generated by the brain, facial muscles and eye muscles, read by the NIA (Neural Impulse Actuator), are translated into control commands in the controller of a pneumatic servo system. Biosignals (EEG, EMG and EOG) detected by means of a special forehead band with three sensors are sent to the actuator box, where they are interpreted as control signals. The control signals from the actuator box are transmitted via a wireless WiFi network to the controller of the servo drive. The control operator wears a head band with three electrodes, which record the bioelectrical signals generated by the brain, face and eye muscles. The signals are then enhanced by the NIA, fed into a Wireless Network Interface Controller (WNIC) and analysed by the software included with the device. The precision of position control of the electro-pneumatic servo drive may be increased if the operator undergoes appropriate training. As a result of the experiment, the positional control characteristics of the electro-pneumatic servo mechanism for different input signals were obtained. Electrical activity signals generated by muscle movements (eyeball movement, clenching of teeth) are the easiest to use. With appropriate training, the bioelectrical signals may be used to control pneumatic servo systems requiring high positioning precision.

Performance characteristics of J-PET detector simulated using GATE package

Kowalski P.1

1Świerk Computing Centre, National Centre for Nuclear Research, Poland

A novel PET system based on plastic scintillators is being developed by the J-PET collaboration [1, 2]. In order to determine the performance characteristics of the built scanner prototype, advanced computer simulations must be performed. The simulations are performed using the GATE software (Geant4 Application for Tomographic Emission) on the CIŚ cluster (Świerk Computing Centre at the National Centre for Nuclear Research). The performance of any PET scanner may be evaluated, inter alia, using the relations between true, scattered and accidental coincidences [3]. Results of studies of such coincidences for the J-PET scanner will be presented.

[1] P. Moskal et al., Nucl. Instr. and Meth. A 775 (2015) 54-62.

[2] P. Moskal et al., Nucl. Instr. and Meth. A 764 (2014) 317-321.

[3] P. Kowalski et al., Acta Physica Polonica A, (2015), in print.

Improving access to expert consultation with tele-dermoscopy and tele-confocal: an introduction to a cloud-based communication system

Łudzik J.1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

Skin tumours are the most frequent cancers in the general population, with a rising trend. Dermoscopy enables more accurate diagnosis than traditional naked-eye examination, and its accuracy has been shown to be further improved by information provided by reflectance confocal microscopy (RCM). Teledermatology has traditionally allowed for the exchange of medical information across the internet based on store-and-forward (SAF) technology, with the upload, download and review of digital dermoscopy images; the advent of new technology now enables nearly immediate access to expert consultation without the limitation of distance. With the advancement of cloud-based telemedicine systems, developed in the United States and Germany, and of high-speed internet connections, data can now be streamed instantaneously, allowing for the transmission of larger file formats that can be used in various applications. We now have the ability to efficiently add reflectance confocal microscopy into a telemedicine platform, including the supervision of a dermatology department's skin cancer screening workflow from one computer workstation, and to have potentially immediate access to experts at any location via simultaneous upload and streaming. A new model of high-technology skin cancer screening will be presented, integrating the diagnostic workflow with the SAF system called Vivanet®.

Novel J-PET plastic scintillator with 2-(4-styrylphenyl) benzoxazole as a wavelength shifter

Wieczorek A.1

1Department of Nuclear Physics, Jagiellonian University, Poland

Plastic scintillators are the key part of the positron emission tomography device which is being constructed at the Jagiellonian University. Experiments confirmed that the low detection efficiency and slight probability of the photoelectric effect do not prevent the use of plastic scintillators as detectors for 511 keV gamma quanta [1]. The novelty of the J-PET scintillator concept lies in the application of 2-(4-styrylphenyl)benzoxazole as a wavelength shifter [2]. Properties of the J-PET scintillators will be presented and discussed in the context of positron emission tomography.

[1] P. Moskal et al., Nucl. Instr. and Meth. A 764 (2014) 317.

[2] A. Wieczorek et al., Patent Application No. P409387 (2014).

Comparative studies of FE boards for use in the J-PET scanner

Niedźwiecki S.1

1Department of Nuclear Physics, Jagiellonian University, Poland

The Jagiellonian Positron Emission Tomograph (J-PET) prototype based on plastic scintillators is currently under development at the Jagiellonian University (see e.g. [1, 2]). Each J-PET module consists of a long polymer scintillator strip with fast photomultipliers connected to each end. In order to take advantage of the very good timing properties of plastic scintillators, dedicated front-end electronics need to be designed. Three different approaches to probing signal shapes will be presented: multi-constant threshold, multi-constant fraction and multi-digital constant threshold [3]. For each of these probing methods, results from tests and performed measurements will be presented.

[1] P. Moskal, Sz. Niedźwiecki et al., Nucl. Instr. and Meth. A 775 (2015) 54-62.

[2] P. Moskal, Sz. Niedźwiecki et al., Nucl. Instr. and Meth. A 764 (2014) 317-321.

[3] M. Pałka, Sz. Niedźwiecki et al., Bio-Algorithms and Med-Systems 10 (2014) 41-45.

Branched IFS models with positioners for biological visualizations

Stępień C.1, Prolejko M.2

1Institute of Computer Science, Warsaw University of Technology, Poland

2Faculty of Mathematics and Computer Science, University of Warmia and Mazury in Olsztyn, Poland

While researching new algorithms for computer graphics, we focused on the ones that are useful in modeling biological formations. Iterated Function Systems (IFS) are commonly used to visualize fractals or three-dimensional self-similar objects. Brevity is their main advantage: in the simplest cases, it is enough to define the shape of the base module and the set of transformations in order to create a multi-modular object. In the literature, models of shells, horns and beaks are described. To model more complex formations, a modified method is applied wherein the parameters of the transformation depend on the iteration number. The presented method combines IFS with a new approach to modeling compound objects that uses positioners. The positioner itself and its possible applications were described in papers that do not refer to IFS models. We will show how to use positioners with IFS models, including branched ones (such as models of the bronchial tree). The achieved models can be simplified or made more accurate depending on the variant of the algorithm. Thanks to the positioners, these models have a continuous lateral surface regardless of the cross-section shape used. The algorithm is described along with the requirements a base module must satisfy to achieve this characteristic. It is indicated that positioners simplify the work of a graphic designer. The obtained models of bronchial trees can be used, e.g., in 3D interactive visualizations for medical students.
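
The basic IFS mechanism the method builds on - a base shape plus a set of affine transformations iterated repeatedly - can be sketched with the classic chaos-game iteration. This is a generic illustration (the maps below draw a Sierpinski triangle), not the authors' positioner algorithm:

```python
import random

def ifs_points(transforms, n=10000, seed=0):
    """Chaos-game rendering of an Iterated Function System.

    Each transform is (a, b, c, d, e, f), encoding the affine map
    (x, y) -> (a*x + b*y + e, c*x + d*y + f).  Starting from the
    origin and repeatedly applying a randomly chosen map makes the
    point cloud converge onto the attractor of the IFS."""
    rng = random.Random(seed)
    x = y = 0.0
    pts = []
    for _ in range(n):
        a, b, c, d, e, f = rng.choice(transforms)
        x, y = a * x + b * y + e, c * x + d * y + f
        pts.append((x, y))
    return pts

# Sierpinski triangle: three half-scale copies of the unit triangle.
sierpinski = [
    (0.5, 0.0, 0.0, 0.5, 0.00, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.50, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.25, 0.5),
]
pts = ifs_points(sierpinski, n=5000)
```

The modified method described in the abstract would, in addition, let the transform parameters vary with the iteration number and attach modules via positioners rather than plain affine composition.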

Characterization of the spatial resolution of the J-PET detector

Kubicz E.1

1Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University, Poland

A prototype of a positron emission tomography device based on plastic scintillators is currently being developed at the Jagiellonian University [1-3]. The Point Spread Function (PSF) determined for the detector with different scintillator shapes will be shown. The PSF can be used both to determine and to improve the spatial resolution of PET scanners [4]. The spatial resolution of the J-PET prototype was determined using two 22Na radioactive sources placed inside the detector.

[1] P. Moskal, S. Niedźwiecki et al., Nucl. Instr. and Meth. A 764 (2014) 317.

[2] P. Moskal, N. Zoń et al., Nucl. Instr. and Meth. A 775 (2015) 54.

[3] L. Raczyński et al., Nucl. Instr. and Meth. A 764 (2014) 186.

[4] V. Bettinardi et al., Med. Phys. 38 (10) (2011).
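
For reference, the spatial resolution of a PET scanner is conventionally quoted as the full width at half maximum (FWHM) of the PSF; for a Gaussian profile, FWHM = 2*sqrt(2*ln 2)*sigma. The sketch below shows both the closed-form value and a direct estimate from a sampled profile (illustrative code, not the J-PET analysis):

```python
import math

def fwhm_gaussian(sigma):
    """Closed-form FWHM of a Gaussian point spread function."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma

def fwhm_from_profile(xs, ys):
    """Estimate the FWHM directly from a sampled 1-D profile by
    linearly interpolating the two half-maximum crossings."""
    half = max(ys) / 2.0
    crossings = []
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if (y0 - half) * (y1 - half) < 0:   # profile crosses half-max here
            crossings.append(x0 + (half - y0) * (x1 - x0) / (y1 - y0))
    return crossings[-1] - crossings[0]

# Sampled Gaussian profile with sigma = 2 mm: both estimates agree.
xs = [i * 0.01 for i in range(-1000, 1001)]
ys = [math.exp(-x * x / (2.0 * 2.0 ** 2)) for x in xs]
```

The profile-based estimator is the one applicable to measured point-source data, where no analytic form of the PSF is available.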

Virtual Patients in Massive Open Online Courses: a technical toolbox

Kononowicz A.1, Stathakarou N.2, Berman A.3, McGrath C.2, Bartynski T.4, Nowakowski P.4, Malawski M.5, Zary N.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Department of Learning, Informatics, Management and Ethics, Karolinska Institutet, Sweden

3Department of Clinical Neuroscience, Karolinska Institutet, Sweden

4Academic Computer Center AGH Cyfronet, Poland

5Department of Computer Science, AGH University of Science and Technology, Poland

Massive Open Online Courses (MOOCs) are educational events on the Internet gathering thousands of participants. This form of disseminating knowledge has also gained considerable interest for health-related topics. Methods are sought to retain learners’ attention in the courses by introducing virtual patients; a technical toolbox is needed to handle such extensions. Karolinska Institutet joined the edX MOOC initiative. One of the first courses offered was ‘KIBEHMEDx: Behavioural Medicine – a Key to Better Health’. The course was extended by two branched, multimedia-rich virtual patients implemented in the Open Labyrinth system. Around 20,000 participants enrolled in the course. In order to handle the computational load, the virtual patient platform was deployed as an atomic service on the VPH-Share infrastructure with the use of the Atmosphere cloud management system. We extended the Open Labyrinth system with an interface enabling Single Sign-On. The navigation patterns of the participants were visualized by a bespoke analytic tool. Discussion and conclusions: handling massive interest in MOOCs involving virtual patients requires a technical toolbox consisting of integration interfaces, a virtualization platform handling the computational load, and learning analytic tools. Our work is being continued by streamlining and extending the toolbox’s functionality. Standards-based mechanisms are needed for recording and harvesting data on participants’ experiences, as are interfaces for embedding more complex computational models and tools enabling interaction with other elements of the main MOOC platform.

Biomaterials for hip implants

Choroszyński M.1,2, Skrzypek S.2, Choroszyński M.3

1Scientific Metal Treatment Co., Roselle, USA

2Department of Physical Metallurgy and Powder Metallurgy, AGH University of Science and Technology, Poland

3Medical University of Silesia, Poland

This article reviews some of the important requirements for hip biomaterials (titanium and cobalt alloys), including their response to the body environment (biocompatibility), mechanical behaviour, wear resistance and fretting corrosion. The use of biomaterials for hip implants is one of the major focal points of this article. Background information is given on the metals used in prosthetic devices, biocompatibility, the significance of corrosion, and standards. Austenitic stainless steels are used for temporary applications because they have lower resistance to pitting corrosion than titanium and cobalt alloys. Titanium alloys are widely used for hip implants due to an excellent combination of mechanical properties, low density, good corrosion resistance and acceptable tissue tolerance. However, titanium alloys have low resistance to fretting, which is a major concern in hip implants at the femoral head and neck taper interface. The wear rate of ultra-high-molecular-weight polyethylene (UHMWPE) mated against Ti-6Al-4V is significantly greater than that against the Co-Cr-Mo alloy. In the annealed condition, the Ti-6Al-4V alloy has moderate strength and is not able to support the surface passive layer: the total loading from body weight is high enough to initiate breakdown of the passive layer, and the breakdown of the oxide layer causes abrasive wear. The hard, sharp debris then acts as a third abrasive body. The purpose of the paper is to introduce the groups of metallic materials, and particular alloys, which are chosen more and more often for surgical hip implants. The conclusions of the paper refer to the provided material information, which supports important medical decisions regarding hip implants and the future development of biomaterials and their heat and thermo-chemical treatments.

Relation of the exon units to the 2D and 3D protein structures

Piwowar M.1, Piwowar P.2

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

2Department of Measurements and Electronics, AGH University of Science and Technology, Poland

Exons, the structural units of genes, carry information about the amino acids forming spatial protein structures (exons encode the protein body). Studies of the relation between exons and protein structures provide information about the structural relationship between different molecular levels (genes, proteins). The results show that secondary structures in proteins may not always coincide with the corresponding areas of the protein encoded by the exon units (e.g. an alpha-helix may be encoded by fragments of DNA from two different exons). The amino acids in polypeptides can be encoded by codons whose components are derived from different exons (e.g. two nucleotides from one exon and one from another can encode an amino acid). Exon units do not overlap with the structures at the protein level.
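
The observation that one codon's nucleotides may come from two different exons is easy to make concrete: given the exon lengths of a coding sequence, one can list, for each codon, the exons its three nucleotides fall into. The sketch below uses hypothetical exon lengths for illustration:

```python
def codon_exon_map(exon_lengths):
    """For a coding sequence split into exons of the given lengths,
    return, for each codon, the sorted list of exons (0-based
    indices) that its three nucleotides come from."""
    # Exon index of every coding-sequence position.
    pos_to_exon = []
    for idx, length in enumerate(exon_lengths):
        pos_to_exon.extend([idx] * length)
    total = len(pos_to_exon)
    return [sorted(set(pos_to_exon[i:i + 3]))
            for i in range(0, total - total % 3, 3)]

# Exon 0 ends mid-codon: codon 2 (positions 6-8) takes two
# nucleotides from exon 0 and one from exon 1.
mapping = codon_exon_map([8, 7])   # 15 nt -> 5 codons
```

Any codon mapped to more than one exon is exactly the boundary-straddling case described in the abstract.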

The interactions of beta-lactamases with their inhibitors – the ETS-NOCV analysis

Broniatowska E.1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

Beta-lactamases are a main cause of bacterial resistance to commonly used beta-lactam antibiotics such as penicillins, cephalosporins and their derivatives. They hydrolyze the beta-lactam ring present in the chemical structures of these antibiotics and thus destroy the drugs. The use of beta-lactamase inhibitors makes it possible to counteract drug destruction. The main goal of this contribution is the application of the quantum-chemical ETS-NOCV approach for a detailed investigation of the interactions of beta-lactamases with their inhibitors. The ETS-NOCV method is a combination of the Natural Orbitals for Chemical Valence (NOCV) theory with the Extended Transition State (ETS) energy decomposition approach. As proved by numerous studies, this methodology enables deeper insight into the nature of bonding between interacting moieties. The applicability of ETS-NOCV was tested on the example of two beta-lactamase complexes: with (1R)-1-(2-thienylacetylamino)-1-(3-carboxyphenyl)methylboronic acid (sm2) and with N-2-thiophen-2-yl-acetamide-boronic acid (ctb). The applied methodology determined the interaction network between the enzyme and its inhibitor and, in both cases, confirmed almost all interactions reported by experimentalists. However, for the complex with sm2 it revealed new interactions not mentioned by experimentalists, e.g. hydrogen bonds formed by Asn170 and Asn132. We also observed in this complex that the NH group of the methylboronic acid is completely inactive in the molecular interactions, in contrast to its carbonyl oxygen atom, which forms hydrogen bonds to residues Asn104 and Asn132. The ETS-NOCV method was thus successfully applied to the diagnostics of molecular interactions in biochemistry. Its results not only confirm the chemical interactions in the enzyme-substrate complex suggested by experimentalists but also reveal interactions not pointed out in experimental works and provide evidence for their existence.

Application of Data Mining methods for determination of risk factors for complication appearance after ischaemic stroke

Stanisz A.1

1Department of Bioinformatics and Telemedicine, Jagiellonian University Medical College, Poland

There are many statistical methods for the analysis of nominal (categorical) data. The most popular analysis of such data is the chi-squared test of independence, although its results are often not sufficient for researchers. This contribution concerns the application of several selected statistical analyses for the determination of risk factors for complication appearance after ischaemic stroke. The origin of these complications can be neurological, respiratory, dermatological, circulatory, physiological, gastro-enteric or orthopaedic. Quickly diagnosing complications during the first days of stroke is very challenging and important, especially for the patients. The analysed data concern basic factors responsible for complications appearing after ischaemic stroke - gender, age group (below 20, 20-40, 40-60 and over 60), smoking habits and presence of diabetes - for 1000 examined people. The statistical methods used for the calculations range from the ordinary chi-squared test of independence, through correspondence analysis, up to generalized linear models such as the logistic regression model, log-linear analysis and CHAID analysis. The applied statistical techniques allowed for the investigation of interactions between variables and revealed many interesting details. They showed that all chosen variables (age, gender, smoking and diabetes) are important factors for stroke complications. An increase in patient age increases the complication probability; furthermore, gender and smoking habits have a significant influence on this relation. Additionally, it turns out that diabetes has a moderating effect on complication appearance. This survey corroborates that we should not limit ourselves purely to the traditional chi-squared test: the application of more advanced statistical methods can exhibit interesting associations between variables which are not accessible through simple contingency table analysis. To sum up, the chi-squared statistic is only the starting point in categorical data analysis and should be complemented by further analyses such as correspondence analysis, logistic regression and log-linear models.
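
For illustration, the starting point of such an analysis - Pearson's chi-squared statistic for a contingency table - can be computed directly. The 2x2 table below (smoking vs. complication) is hypothetical and is not the study's data:

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table,
    given as a list of rows of observed counts.  Each expected count
    is (row total * column total) / grand total."""
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    total = sum(row_sums)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_sums[i] * col_sums[j] / total
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts:     complication   no complication
# smoker                        40              60
# non-smoker                    25              75
stat = chi_squared([[40, 60], [25, 75]])   # df = (2-1)*(2-1) = 1
```

The statistic is then compared against the chi-squared distribution with the appropriate degrees of freedom; the more advanced methods named in the abstract (log-linear models, CHAID, logistic regression) take over where this single summary number stops being informative.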

Phylogenetic aspects of the concept of intelligent life design

Krajewski Z.1

1Department of Biosensors and Processing of Biomedical Signals, Silesian University of Technology, Poland

The paper presents a new treatment of molecular evolutionary model as a product of intelligent changes. The aim of this paper is to obtain a life design system, drawing on processes occurring in nature regardless of explanations of the origins of life. The idea of intelligent design and molecular relationship is considered as a basic concept of the intelligent life design system, using some analogies taken from molecular evolutionary models. Three steps of life design system are outlined, but the main subject is an attempt to find certain similar effects of the design system processes and the processes simulated with basic evolutionary substitution models: J-C, Felsenstein and HKY. An idea of gene reduction has been applied, from more complex (taking into account information density) biological systems to less complex, specialized biological systems. Two steps have been taken into consideration: a test stage in the virtual world and an adaptation finishing process after running the systems in the real world. Two algorithms have been applied. The first one has applied similarity related to accommodation process to required conditions in the virtual and the real world. The second algorithm has applied accommodation to required conditions separately (expressed as amino acid substitution) in the first step, using a convenient criterion, and further (similar to observable) accommodation in the real world. A phylogenetic tree, similar to a real one, has been calculated using the above method for mammals, for mtDNA, with the ML method and with the aid of PhyML for the HKY model. The paper is an introduction showing an aspect of the life design system, related to phylogenetic relationships.

Rough assessment of GPU capabilities for parallel PCC-based biclustering method applied to microarray datasets

Orzechowski P.1, Boryczko K.2

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

2Department of Computer Science, AGH University of Science and Technology, Poland

Massively parallel computing architectures have been proven to significantly shorten the time of computations for various clustering algorithms. Nonetheless, some limitations of the architecture hinder the application of GPUs to the task of biclustering, in which the aim is to find local similarities within the data. This might be one of the reasons why not many GPU biclustering algorithms have been proposed so far. In this article we verify whether there is any potential for running complex biclustering calculations on heterogeneous architectures. We introduce the MMPC biclustering algorithm, which uses the Pearson correlation for determining the similarity between rows of the input matrix. We present two implementations of the algorithm: a sequential one and a parallel one dedicated to heterogeneous environments. For the verification we used six GEO datasets from experiments performed on the Affymetrix Human Genome U133A microarray platform. The datasets had the same number of rows but different numbers of columns. Two NVIDIA graphics cards were involved: a Tesla M2090 (Fermi architecture) and a Tesla K20c (Kepler architecture). We examined the obtained speedups in order to assess whether a heterogeneous architecture may successfully shorten the time of heavy biclustering calculations. We proved that both the Fermi and the Kepler architecture successfully shorten the time of computations for the selected datasets and are suitable for performing heavy biclustering computations involving Pearson’s Correlation Coefficient (PCC). Nonetheless, it is debatable whether a similar gain can be achieved for less intensive computations.
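
The similarity measure at the core of the approach - Pearson correlation between all pairs of rows of the input matrix - can be sketched sequentially as follows. This is an illustrative baseline, not the MMPC implementation; note that each row pair is independent, which is exactly what makes the computation attractive for a GPU:

```python
import math

def pearson(u, v):
    """Pearson correlation coefficient between two equal-length rows."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def row_pcc_matrix(matrix):
    """All pairwise row-row correlations of an expression matrix --
    the similarity step a PCC-based biclustering algorithm builds on.
    On a GPU, each (i, j) entry would map to an independent thread."""
    n = len(matrix)
    return [[pearson(matrix[i], matrix[j]) for j in range(n)]
            for i in range(n)]

rows = [[1.0, 2.0, 3.0, 4.0],
        [2.0, 4.0, 6.0, 8.0],    # perfectly correlated with row 0
        [4.0, 3.0, 2.0, 1.0]]    # perfectly anti-correlated with row 0
pcc = row_pcc_matrix(rows)
```

A biclustering step would then group rows whose mutual correlations exceed a threshold over a subset of columns, which is where the "local similarity" aspect comes in.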

Computer-based method for detecting colorblindness

Laskowski M.1, Zabrodzki J.2

1Institute of Computer Science, Lublin University of Technology, Poland

2Institute of Computer Science, Warsaw University of Technology, Poland

Colorblindness is the decreased ability to perceive colors correctly, varying from the reduced spectral sensitivity of one of the three retinal photoreceptors, through color vision reduced to only two dimensions, up to a complete inability to distinguish colors. Ishihara plates are the most widely used tool for detecting colorblindness, mainly due to their simplicity and low cost, although they allow detecting only certain types of color vision disorders (CVD) and the obtained results may be ambiguous. More accurate methods are less available due to their cost or complexity. Being properly diagnosed is important not only for professional reasons but also for everyday quality of life or even safety. All of these facts motivate the need for developing a new, low-cost and easy-to-use CVD detection method. This paper presents the results of a pilot study on a new computer-based CVD detection method, designed as an aid for screening tests on large groups of patients. Each diagnosis is based on an analysis of the user’s decisions and actions made while playing simple computer games. Each game is divided into four phases: introductory (used to explain the game rules), initial (for detecting potential CVD), specifying (using the diagnosis from the previous phase to differentiate between dichromacy and anomalous trichromacy) and confirming (corroborating the final diagnosis). Computer games were chosen as a testing tool in order to minimize the influence of external factors and to increase user involvement in the whole process. The experimental results (discussed in the paper) prove the system to be a useful tool for detecting color vision disorders, although it requires additional work on determining and corroborating the values of the acceptance threshold and the lie scale.

Mechanical valves in pulsatile Ventricular Assist Devices

Barcik J.1

1Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology in Kraków, Poland

Mechanical Circulatory Support therapy is realized with the use of a Ventricular Assist Device (VAD), which supports the native heart. The aim of the project was to review the constructions of valves used in VAD systems and to perform the conceptual design of laboratory equipment for conducting valve pass/fail assessment tests. The design was based on the international standard ISO 5840:2005, which describes cardiac valve prostheses. The construction of valves used in VAD systems and their operating environment are similar to those of cardiac valve prostheses; therefore it is possible to adapt the ISO 5840:2005 procedures to test valves used in VAD systems. The laboratory equipment was designed and selected based on a literature review in the field of mock circulatory loops. An extensive state-of-the-art review of the different types of valves used in MCS devices preceded the concept development of the valve validation laboratory stand. Afterwards, the author designed the elements of a laboratory stand able to perform a steady-flow test verifying the pressure drop across the valve. Additionally, the laboratory stand was designed to perform tests in a pulsatile manner imitating the conditions of valve operation in a Ventricular Assist Device; for this purpose it was equipped with modules responsible for the simulation of vascular resistance and vascular compliance. The concepts of simulating cardiac conditions in a laboratory environment presented in the project might be implemented when designing mock circulatory loops, which are a necessary element of proof tests for such devices as cardiac valve prostheses, VAD systems or even the Total Artificial Heart. Additionally, the proposed solutions can be used in laboratory trials to observe various cardiovascular phenomena.

On scientific research using Scalarm Platform for modeling and simulation

Liput J.1, Król D.2, Słota R.3, Kitowski J.3

1Academic Computer Center AGH Cyfronet, Poland

2Information Sciences Institute, University of Southern California, USA

3Department of Computer Science, AGH University of Science and Technology, Poland

Discoveries in many fields of science, e.g. bioinformatics, are increasingly attained through large-scale experiments based on computer simulation. Typically, a simulation is repeated with different sets of input parameters to study the correlation between the input data and the results. Such experiments often use High Throughput Computing (HTC) resources to harness their embarrassingly parallel nature. This type of research, usually referred to as parameter studies, may be supported by resource management systems. However, these systems do not provide any means of intelligent input data selection or integration with data analysis tools, hence scientists have to perform mundane work manually, e.g. simulation scheduling and results gathering. We argue that using a dedicated tool – Scalarm – decreases the time necessary to conduct such research by integrating computations on HTC resources with data preparation and analysis. Scalarm supports an extended variant of parameter studies, which involves Design of Experiments and data analysis steps to facilitate parameter space exploration and minimize the number of executed simulations. The user starts an experiment by uploading a simulation model. Then, a parameter space to explore is specified. Next, the simulation model is executed for each element of the parameter space on HTC resources. Finally, the results are gathered, visualized and analyzed, which often leads to an expansion of the parameter space. Scalarm has been successfully applied to study problems from molecular dynamics, metallurgy and behavioural studies. In each case it has accelerated research by enabling scientists to focus on solving problems rather than on managing resources.

Acknowledgement: Supported by POIG.02.03.00-12-137/13.
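
The plain parameter-study loop that Scalarm automates and extends can be sketched as follows. This is a naive sequential baseline with a toy simulation function; Scalarm additionally schedules the runs on HTC resources and applies Design of Experiments to prune the parameter space:

```python
import itertools

def run_parameter_study(simulation, parameter_space):
    """Exhaustive parameter study: run the simulation once for every
    combination of input parameters and gather (params, result) pairs.
    `parameter_space` maps each parameter name to a list of values."""
    names = sorted(parameter_space)
    results = []
    for values in itertools.product(*(parameter_space[n] for n in names)):
        params = dict(zip(names, values))
        results.append((params, simulation(**params)))
    return results

# Toy two-parameter space: 3 temperatures x 2 pressures = 6 runs.
space = {"temperature": [280.0, 300.0, 320.0], "pressure": [1.0, 2.0]}
runs = run_parameter_study(
    lambda temperature, pressure: temperature * pressure, space)
```

The data-analysis step then inspects `runs` and typically adds new points near interesting regions - the "parameter space expansion" mentioned in the abstract.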

The analysis of imaging capabilities for static layouts of scatterers using Doppler tomography method

Świetlik T.1, Opieliński K.1

1Faculty of Electronics, Wrocław University of Technology, Poland

The pulse-echo method is most often used in diagnostic methods based on ultrasonic imaging of tissue. It is also possible to use for this purpose the Doppler effect and a continuous ultrasonic wave of much higher energy. This method is called Doppler tomography (DT), or continuous wave ultrasound tomography (CWUT). In this case, tomographic image reconstruction is based on the recording of the Doppler frequencies generated by the movement of the ultrasonic probe (consisting of a sending and a receiving transducer) relative to static scatterers contained in the structure of the examined object. Doppler tomography is a special development of the well-known ultrasonic Doppler method used for imaging blood flow or the physiologic motion of tissues. Currently, there is a lack of comprehensive research on DT, in particular simulations and measurements of the structure of biological media for analyzing the applicability of this method in medical diagnostics, e.g. in bone examination. As part of this study, a research set-up has been developed allowing the registration of Doppler signals resulting from the scattering of a continuous ultrasonic wave by an arrangement of wires rotating in water. Also, a method to simulate Doppler frequencies for different layouts of point scatterers was developed, along with a method of tomographic reconstruction which allows imaging of these layouts. This allowed a preliminary analysis of the quality and resolution of DT imaging depending on the relative positions of the scatterers, the quantization of rotation angles and the Doppler frequency spectrum. The results indicate that DT allows imaging of layouts of scatterers whose size is comparable to the wavelength of the applied ultrasonic wave.
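
The Doppler frequencies recorded in such a set-up follow from the classical reflection Doppler formula, f_d = 2*f0*v*cos(theta)/c. The sketch below uses illustrative values (probe frequency, rotation geometry); the actual DT reconstruction works with the full recorded Doppler spectrum, not a single shift:

```python
import math

def doppler_shift(f0, v, theta_deg, c=1500.0):
    """Doppler frequency (Hz) for a continuous wave of frequency f0 (Hz)
    reflected off a scatterer moving at speed v (m/s) relative to the
    probe, at angle theta to the beam; c is the speed of sound in
    water (~1500 m/s)."""
    return 2.0 * f0 * v * math.cos(math.radians(theta_deg)) / c

# Scatterer on a 5 cm radius rotating at 1 rev/s, seen by a 5 MHz probe:
v = 2.0 * math.pi * 0.05            # tangential speed, ~0.314 m/s
fd = doppler_shift(5.0e6, v, theta_deg=0.0)
```

Because the shift vanishes when the motion is perpendicular to the beam (theta = 90 degrees), each scatterer traces an angle-dependent frequency signature as the probe rotates, and it is this signature that the tomographic reconstruction inverts.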

Omnidirectional B-mode ultrasound imaging of breast tissue

Opieliński K.1

1Faculty of Electronics, Wrocław University of Technology, Poland

Breast cancer is now considered one of the major threats to public health among women. As a complement to X-ray mammography, conventional ultrasound imaging (UI) is currently used. However, it has limited resolution and is operator-dependent, therefore it is not reliable enough for screening. Leading B-mode ultrasound scanner manufacturers currently offer expensive devices which automatically scan the breast in vivo with a long linear array whose shape is adapted to a female breast in the supine position, coupled using a gel and moved along the breast at several different angles of inclination. This paper presents a novel concept of two different methods of high-resolution omnidirectional B-mode ultrasound imaging of breast tissue in vivo, which are automatic and non-contact: full angle spatial compound imaging (FASCI) and omnidirectional ultrasound imaging (OUI). The essence of both methods is the use of a specially designed circular array of 1024 ultrasonic transducers surrounding the breast immersed in water. The FASCI method involves compounding the ultrasound image of each individual coronal breast section from partially overlapping fragments obtained around the breast from 32 sectors of the circular array. OUI involves direct imaging of the whole coronal breast section by activating and switching the elementary transducers in the same manner as in conventional UI, but with the use of the circular array. The FASCI method was tested using B-mode ultrasonic scanners with a sector probe and a linear array, which were rotated in water around a breast biopsy phantom. The resulting FASCI images of the phantom structures have better quality and much higher resolution in comparison to conventional ultrasound images.

Influence of neural network structure on its performance in solving a chosen medical problem

Smyczyńska U.1

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

It is known that the learning outcome of artificial neural networks (ANNs) is influenced by their structure, particularly the number of neurons. Thus, it was decided to investigate this effect on the example of multilayer perceptrons (MLPs) trained to predict the final height (FH) of patients treated for growth hormone deficiency (GHD). The dataset used in the experiment consisted of 150 training, 70 validation and 69 testing cases. All of them contained values of 8 different parameters that seem important for the prediction of FH. These data were used to train MLPs with a single hidden layer (HL) containing from 1 to 3000 neurons, as well as an MLP without an HL. Learning was stopped when no improvement of accuracy on the validation set was observed for several tens of epochs. For each ANN size, the procedure was repeated 10 times with different initializations. It was observed that the problem can be solved almost equally well (with an average FH error of about 4.2 cm in testing) by all networks containing from 0 to 100 hidden neurons (HN). All those ANNs performed similarly in training and validation, and slightly worse in testing. A further increase in the HN number resulted in an observable worsening of prediction quality, especially in validation and testing. The problem of predicting FH in patients treated for GHD seems to be relatively uncomplicated, since it can be solved reasonably well even by an MLP without a hidden layer. However, when the number of HN is increased, our dataset becomes too small to provide enough information to calculate all weight coefficients accurately. Moreover, large MLPs show a tendency to memorize the training data rather than find the true dependence.
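The training protocol described above — a single-hidden-layer MLP with early stopping when the validation error stops improving — can be sketched in a few dozen lines. Everything below is a hypothetical toy (synthetic regression data, made-up layer size and learning rate), not the study's network or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(Xtr, ytr, Xval, yval, n_hidden=10, lr=0.05,
              patience=30, max_epochs=5000):
    """1-hidden-layer MLP (tanh) trained by full-batch gradient descent,
    stopping early when the validation MSE stops improving."""
    n_in = Xtr.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)

    def forward(X):
        H = np.tanh(X @ W1 + b1)
        return H, H @ W2 + b2

    best_val, wait = np.inf, 0
    for _ in range(max_epochs):
        H, out = forward(Xtr)
        err = out - ytr                              # dL/d(out) for MSE/2
        gW2 = H.T @ err / len(Xtr); gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H**2)               # backprop through tanh
        gW1 = Xtr.T @ dH / len(Xtr); gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
        val_mse = np.mean((forward(Xval)[1] - yval)**2)
        if val_mse < best_val - 1e-9:
            best_val, wait = val_mse, 0
        else:
            wait += 1
            if wait >= patience:                     # early stopping
                break
    return best_val

# Synthetic regression task standing in for the clinical data
X = rng.uniform(-1, 1, (300, 2))
y = np.sin(2 * X[:, :1]) + 0.5 * X[:, 1:] + 0.05 * rng.standard_normal((300, 1))
val_mse = train_mlp(X[:200], y[:200], X[200:], y[200:])
```

With far more hidden units than the data support, the same loop would show the overfitting the abstract reports: training error keeps falling while validation error stalls and triggers the stop.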

Application of S-transform to signal analysis

Szymczyk P.1, Szymczyk M.1

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

The paper presents the opportunities of signal analysis using the S-transform. Images of the modulus and phase angle of the S-transform present the results and may be used as models to compare with unknown signals and to detect patterns and anomalies, even in very sophisticated signals. The S-transform was defined by R. Stockwell. It is used, among many other applications, to investigate power-line quality, data from ground-penetrating radar (GPR), etc. Its characteristics may also be used to examine other signals, including medical ones such as the ECG. Utilizing the S-transform, we can determine some properties of signals. It is especially well-suited for this because it allows one to pinpoint a property in terms of both time and frequency. Other interesting and useful characteristics of the S-transform are: independence of the response amplitude from the signal frequency, progressive resolution, and information about the signal phase. This paper presents the definition and various examples of computing the S-transform. The paper also covers a concept of using the S-transform to investigate anomalies in signals.
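The discrete S-transform defined by Stockwell can be computed directly from the FFT of the signal, using a frequency-scaled Gaussian window in the frequency domain. The sketch below is a generic textbook-style implementation (not the authors' code), with an illustrative pure-cosine input.

```python
import numpy as np

def stockwell(h):
    """Discrete S-transform of a real signal h of length N.
    Returns S of shape (N//2 + 1, N): rows are frequencies in cycles
    per record (row 0 holds the signal mean), columns are time."""
    N = len(h)
    H = np.fft.fft(h)
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = np.mean(h)                        # zero-frequency "row"
    m = np.arange(N)
    # FFT-ordered frequency offsets: 0, 1, ..., N/2-1, -N/2, ..., -1
    m_signed = (m + N // 2) % N - N // 2
    for n in range(1, N // 2 + 1):
        # Gaussian voice window; its width scales with frequency n
        g = np.exp(-2 * np.pi**2 * m_signed**2 / n**2)
        S[n, :] = np.fft.ifft(np.roll(H, -n) * g)
    return S

# Illustrative input: cosine at 10 cycles per record
N = 128
h = np.cos(2 * np.pi * 10 * np.arange(N) / N)
S = stockwell(h)
```

For this input, the magnitude of row 10 is approximately constant at 0.5 (half the amplitude, since the transform is one-sided), which is the kind of modulus image the abstract refers to.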

Application of Levenberg–Marquardt algorithm for engagement detection in electroencephalographic time-series

Wójcik P.1, Wójcik G.2

1Computer Science, Polish-Japanese Academy of Information Technology, Poland

2Faculty of Transport and Computer Science, University of Economics and Innovation in Lublin, Poland

Even though electroencephalographic techniques have been known for a century, it is only nowadays that low-cost electroencephalographs have started to appear on the market as entertainment devices or simple laboratory equipment. One such device is the popular EPOC neuroheadset by Emotiv. It is designed for constructing simple BCIs or for the detection of a few cognitive states such as engagement, frustration, meditation, excitement and long-term excitement. Equipped with 14 channels and a gyroscope, it can gather EEG signals and head position with satisfactory quality. However, the device is surrounded by some mysteries, such as the closed source of the algorithm used for cognitive state computation. It is only speculated on Internet forums that fractal analysis is used to compute the output of the so-called cognitive suite. The aim of the research presented in this article was to apply a kind of reverse engineering to engagement detection using artificial neural networks. We applied the Neural Networks Toolbox for Matlab to a data set collected from one adult student during mathematical problem solving. The network was built of a static perceptron with one hidden layer containing 15 neurons and trained using the Levenberg–Marquardt algorithm. The results we obtained are satisfactory. The network approximated the function with a mean squared error equal to 7e-04 and a regression correlation value equal to 0.97. We hypothesize that our neural network can discover the algorithm for engagement detection used by Emotiv. By manipulating its parameters we will be able to tell which channels are more or less important for engagement and compare our experiments with well-known results based on band detection for engagement and workload computation.
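The Levenberg–Marquardt algorithm used here blends gradient descent with the Gauss–Newton step via a damped normal equation, delta = (J^T J + lambda I)^(-1) J^T r. A minimal sketch of that update on a toy curve-fitting problem (not the MLP itself; the model y = a*exp(b*x) and all data are hypothetical) is:

```python
import numpy as np

def lm_fit(x, y, p0, n_iter=50, lam=1e-2):
    """Minimal Levenberg-Marquardt loop for fitting y = a*exp(b*x).
    The same damped update is what LM applies to all MLP weights."""
    p = np.array(p0, dtype=float)
    for _ in range(n_iter):
        a, b = p
        r = y - a * np.exp(b * x)                    # residuals
        J = np.column_stack([np.exp(b * x),          # d f / d a
                             a * x * np.exp(b * x)]) # d f / d b
        # Damped normal equations: (J^T J + lam*I) delta = J^T r
        delta = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
        p_new = p + delta
        r_new = y - p_new[0] * np.exp(p_new[1] * x)
        if np.sum(r_new**2) < np.sum(r**2):
            p, lam = p_new, lam * 0.5    # accept step, trust Gauss-Newton more
        else:
            lam *= 2.0                   # reject step, damp more (gradient-like)
    return p

x = np.linspace(0, 1, 20)
y = 2.0 * np.exp(1.5 * x)                # noiseless synthetic data
a, b = lm_fit(x, y, p0=[1.0, 1.0])
```

The adaptive damping factor is what gives LM its fast, stable convergence on small networks such as the 15-neuron perceptron described above.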

Independent component analysis of EEG data for the EGI System

Gajos A.1, Wójcik G.1

1Institute of Computer Science, Maria Curie-Skłodowska University, Poland

Component analysis is one of the most important methods used for EEG signal decomposition, and so-called Independent Component Analysis (ICA) is commonly used. The main function of the ICA algorithm is to find a linear representation of non-Gaussian data whose elements are statistically independent, or at least as independent as possible. There are many commercial solutions for EEG signal acquisition. Usually, together with the electroencephalograph, one gets dedicated software to handle the signal. However, quite often the software does not provide researchers with all the necessary functions. The high-performance, dense-array EGI EEG System is distributed with NetStation software. Although NetStation is a powerful tool, it does not include any implementation of the ICA algorithm. This causes many problems for researchers who want to export raw data from the amplifier and then work on them using other tools such as EEGLAB for Matlab, as these data are not fully compatible with the EGI format. We will present a C++ implementation of ICA that can handle filtered data from the EGI more conveniently. Our tool offers visualization of the raw signal and of the ICA algorithm results, and will be distributed under the General Public License.
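For readers unfamiliar with the algorithm, the core of ICA — whitening followed by a fixed-point iteration toward maximally non-Gaussian projections — can be sketched compactly. This is the classical symmetric FastICA scheme with a tanh nonlinearity in Python, shown only as an illustration of the method (the authors' tool is a separate C++ implementation); the two-channel mixture below is synthetic.

```python
import numpy as np

def fastica(X, n_iter=200, tol=1e-6):
    """Minimal symmetric FastICA sketch (tanh contrast).
    X: (channels, samples). Returns an unmixing matrix to be applied
    to the centered data so that the outputs are ~independent."""
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    K = E @ np.diag(d**-0.5) @ E.T            # whitening matrix
    Z = K @ Xc
    n, T = Z.shape
    W = np.random.default_rng(0).standard_normal((n, n))
    for _ in range(n_iter):
        WZ = W @ Z
        g, g_prime = np.tanh(WZ), 1 - np.tanh(WZ)**2
        W_new = (g @ Z.T) / T - np.diag(g_prime.mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)       # symmetric decorrelation:
        W_new = u @ vt                        # W <- (W W^T)^(-1/2) W
        if np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1)) < tol:
            W = W_new
            break
        W = W_new
    return W @ K

# Synthetic demo: unmix two non-Gaussian sources from a linear mixture
t = np.linspace(0, 8, 2000)
S_true = np.vstack([np.sign(np.sin(3 * t)),   # square wave
                    (t % 1.0) - 0.5])         # sawtooth
A = np.array([[1.0, 0.5], [0.5, 1.0]])        # mixing matrix
X = A @ S_true
W = fastica(X)
Y = W @ (X - X.mean(axis=1, keepdims=True))   # recovered components
```

Up to sign and ordering (the usual ICA ambiguities), the rows of `Y` reproduce the two sources.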

Simulation of cochlear implant patient’s hearing

Walkowiak A.1, Lorens A.1, Kostek B.2, Skarżyński H.3

1Department of Implants and Auditory Perception, World Hearing Center of the Institute of Physiology and Pathology of Hearing, Poland

2Department of Multimedia Systems, Gdansk University of Technology, Poland

3World Hearing Center of the Institute of Physiology and Pathology of Hearing, Poland

The aim of the study was to develop and validate a computer simulation of cochlear implant patients' hearing with the spread of excitation as a parameter. Acoustic probes from the simulation developed at IFPS were presented in free-field conditions to 25 volunteers (13 female and 12 male, aged from 21 to 38 years) with normal hearing thresholds. The discrimination score of the probes for three spread of excitation (SoE) width values was assessed. Despite the large variability of the results in each SoE width group, ANOVA test results showed that the correlation between the simulated SoE width and monosyllabic word discrimination scores was statistically significant (p<0.001). Conclusions: thanks to objective measurements of the auditory pathway of implanted patients it is possible to develop a simulation of "electric hearing", which could explain the differences in speech discrimination scores from patient to patient.

Using mathematical modeling to find possible interactions between NF-κB

Kardyńska M.1, Śmieja J.1

1Institute of Automatic Control, Silesian University of Technology, Poland

NF-κB is a family of transcription factors that regulate the transcription of hundreds of genes, including genes coding anti-apoptotic factors. Inhibition of the NF-κB pathway could improve anticancer therapy efficacy and can be achieved by means of hyperthermia and the HSF1-dependent pathway. However, the precise mechanism of this suppression has not been discovered yet. In our work we used mathematical modeling to test possible hypotheses about the mechanism of the NF-κB pathway suppression. Since these pathways have so far been modeled separately, we chose two published separate models and merged them, changing some of their components when necessary. Based on literature sources and the results of our own experiments, we tested numerically which of the hypothetical interactions between the two pathways allow us to reproduce the experimental results. We found three possible mechanisms which could mediate NF-κB pathway inhibition through heat shock: 1) competition for the IKK protein, 2) inhibition of proteins activating the cellular response upstream of IKK activation, 3) inhibition of NF-κB import into the nucleus. The obtained numerical results indicate that the mechanism behind the NF-κB suppression may also be a combination of those mentioned above. The in silico analysis has allowed us to find three plausible mechanisms of the NF-κB pathway suppression and simultaneously made it possible to exclude some of the mechanisms known from the literature. Selecting the proper mechanism still requires confronting numerical and experimental results, but the performed analysis made it possible to limit the scope of the experimental research needed.

Acknowledgement: The work has been supported by the NCN grant DEC-2012/05/B/NZ2/01618.

Acquisition of individual characteristics of plantar pressure distribution images of the foot

Dzienniak D.1, Cieślik J.1

1Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Poland

Plantar pressure distribution images (foot pressure images, plantocontourograms) are obtained from pedobarographs based on pressure values read by sensors. The images can be analyzed and used for identifying or diagnosing biomechanical and neuropathic foot disorders as well as other abnormalities. Different methods of digital analysis are used to recognize symptoms on the basis of the plantar pressure distribution. Fourier-based and wavelet-based algorithms can greatly facilitate the process of extracting and analyzing important data from the images. Wavelets are a fast-developing tool in signal analysis, whether of one- or two-dimensional signals. The main purpose is to process and analyze image data effectively, since early diagnosis can significantly improve treatment efficacy. There has been an increasing number of people suffering from a condition referred to as diabetic foot. Many of them have had their feet or legs amputated, primarily because of ulceration or infections caused by the aforementioned condition as a consequence of diabetes. Diabetes contributes to approximately 80% of the 120,000 nontraumatic amputations performed yearly in the United States alone. Some of these amputations could have been avoided had the patients been correctly diagnosed and treated accordingly. The present paper aims at discussing a Fourier–wavelet approach to image data analysis. It begins with a plantocontourogram generated from data recorded by the PEL-38 Electronic Podometer. Then it provides a brief description of an algorithm that prepares the image for further analysis. Finally, it presents Fourier-based and wavelet-based methods that can help to process and analyze the information and characteristics extracted from the image.
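The Fourier step of such an approach amounts to taking the 2-D power spectrum of the pressure map. A minimal sketch (the array below is a synthetic stand-in for a PEL-38 plantocontourogram, not real data) is:

```python
import numpy as np

# Hypothetical pressure map: a crude rectangular "footprint" blob on a
# 64x64 sensor grid, standing in for a recorded plantocontourogram.
pressure = np.zeros((64, 64))
pressure[20:44, 28:36] = 1.0

# Centered 2-D power spectrum; low frequencies end up in the middle,
# so the spatial-frequency content of the footprint can be inspected.
spec = np.abs(np.fft.fftshift(np.fft.fft2(pressure)))**2
dc = spec[32, 32]     # zero-frequency term = (total pressure)^2
```

Wavelet decomposition (e.g. a 2-D discrete wavelet transform) would then localize those frequency components spatially, which is what makes the combined Fourier–wavelet analysis useful for detecting local abnormalities.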

Application of CAE systems for medical diagnostics and surgical operations planning

Cieślik J.1

1Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Poland

An important part of the treatment process prior to medical intervention is planning operations based on the results of diagnostic medical imaging. Medical knowledge is the most important, but increasingly engineering knowledge is required as well. Comprehensive programs for image processing give the opportunity to create a three-dimensional model based on scanned medical images from ultrasound (USG), computed tomography (CT) or magnetic resonance imaging (MRI). Planar images of body sections are processed by computer systems into full spatial three-dimensional virtual models. The human body, inside which access was previously possible only with surgical tools, can be subjected to further noninvasive analysis and computer simulation using CAE (Computer Aided Engineering) methods previously used in the analysis of engineering materials and structural systems. It is possible to accurately measure distances, angles, shapes and volumes, to simulate tissues and surgical instruments, and to plan the various stages of an operation. The latter process - preoperative planning - is particularly important for stereotactic or laparoscopic surgery conducted inside the patient's body. An example is the preoperative planning of the trajectory of multi-link surgical tools. When planning operations, surgeons now often perform computer simulations first, through which they can rehearse maneuvers in a way that is least difficult for both the patient and the operator. They have at their disposal not only virtual models of the actual patient, but also models of the actual structures obtained using virtual prototyping. The paper gives examples of the analysis of biological structures and of operation planning using CAE systems.

Automated globule detection and counting in skin lesion images

Nowak L.1, Grzesiak-Kopeć K.1, Ogorzałek M.1

1Department of Information Technologies, Jagiellonian University, Poland

In this paper we present a method for the detection and measurement of melanin globules visible in dermoscopic skin images, performing an analysis similar to the one done by a dermatologist during dermoscopic evaluation. The method uses image analysis to select objects present in the dermoscopic image that match the pattern of a globule structure. Classification of the found objects is based on the shape, size and color model of the globule structure. This classification is a difficult task due to color and scale differences between dermatologic images. These differences are related to differences between the equipment used in the acquisition process and to lighting conditions. This paper describes the characteristics of the globule structure needed for correct classification. We also present a method for calculating globule characteristics. The presented method is part of a computer-aided diagnostic process for melanocytic skin lesions. Evaluation of such lesions is a basis for the early detection of malignant lesions.

Signal processing methods in the analysis of uterine contraction activity in women in pregnancy

Oczeretko E.1, Borowska M.1, Brzozowska E.1

1Department of Materials and Biomedical Engineering, Bialystok University of Technology, Poland

In pregnancy, the study of uterine contractions is a powerful tool in the diagnostics of premature birth. Preterm labor remains a major cause of neonatal mortality. Among the many methods of recording uterine contractility, the most important from the diagnostic point of view seems to be electrohysterography (EHG) - the recording of changes in electrical potential associated with contraction of the uterine muscle. It remains controversial whether EHG can identify patients with a high risk of preterm delivery. There is a need to test various digital signal processing techniques (linear and non-linear) to describe the recorded signals. In patients at risk of preterm birth, biochemical studies of serum concentrations of various types of chemokines are also carried out. Linear signal processing techniques are based mainly on changes in the frequency of the power spectrum (fast Fourier transform, wavelet analysis, linear autoregressive modeling). Applicable nonlinear dynamics methods include: approximate entropy, sample entropy, correlation dimension and the Lempel-Ziv complexity measure. EHG is a multi-channel measurement method, so it is important to evaluate the synchronization of the signals, the propagation of the signals, the similarity of the signals and the similarity of their dynamics. Applicable here are, among others: the Hilbert transform, cross-correlation, coherence, the mutual correlation dimension, cross-approximate entropy, mutual information and nonlinear interdependencies. By combining the results of digital signal processing with biochemical studies we can better understand the different pathologies of uterine contractile activity.
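Of the nonlinear measures listed, sample entropy is easy to state compactly: it is the negative log-ratio of the counts of length-(m+1) to length-m template matches within a tolerance r. A textbook-style sketch (not tied to the EHG data; parameters m=2, r=0.2*std are the common defaults) is:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal, r = r_frac * std(x).
    Counts pairs of length-m (and m+1) templates matching within
    tolerance r (Chebyshev distance), excluding self-matches."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    N = len(x)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(N - mm + 1)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    B = count_matches(m)       # matches of length m
    A = count_matches(m + 1)   # matches of length m + 1
    return -np.log(A / B)
```

A regular (predictable) signal yields a low SampEn, an irregular one a high SampEn, which is why the measure is informative for contraction recordings.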

Acknowledgement: This paper was supported by research project no. MB/WM/3/2014 financial resources to help the development of young researchers and doctoral students from the Bialystok University of Technology.

Telemetry recording of the reproductive tract in pigs - investigation of various phases of ovarian cycle

Brzozowska E.1, Pawliński B.2, Oczeretko E.1, Gajewski Z.2

1Department of Materials and Biomedical Engineering, Bialystok University of Technology, Poland

2Department of Large Animal Diseases with Clinic, Warsaw University of Life Sciences, Poland

Analysis of the bioelectrical activity of the animal reproductive tract, which is directly related to the mechanical activity of the myometrium, is important in understanding the physiology of pregnancy and labor. Electromyographic (EMG) measurements have been carried out with the use of telemetric (biotelemetric) methods. Pigs were surgically fitted with 3-channel telemetry implants positioned between the abdominal muscles, and three silicone electrodes which were sewn on the left uterine horn, the uterine body and the right uterine horn respectively, with 18 cm spacing between them. The studies were approved by the Local Ethical Committee. Changes in the characteristics of the recorded EMG signals can be observed throughout the whole ovarian cycle. The luteal phase shows numerous single action potential spikes separated by periods without action potentials. Proestrus is characterized by an increasing EMG amplitude of the isthmus, while the EMG amplitude of the bulbs of the oviduct decreases. By contrast, during estrus the occurrence of short spikes with high frequency and amplitude was registered. In the ovulation phase, there is a strong, long-lasting agitation connected with a lower frequency. This significant force of excitations suddenly decreases after estrus to the proestrus level. Fourier analysis was performed for signals representing contractions of the myometrium obtained from the EMG signals. For representative signals, the mean values of the dominant frequencies were 0.0019 Hz for the luteal phase, 0.0034 Hz for proestrus and 0.0055 Hz for estrus. In animal studies, wireless monitoring increases the animals' comfort. This allows long signal recordings and a better understanding of the involved physiological processes.
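The dominant-frequency figures quoted above come from spectral analysis of the contraction signals. A minimal sketch of that computation (with a synthetic slow wave at the proestrus value, since the recorded data are not reproduced here, and an assumed 1 Hz sampling rate) is:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency [Hz] of the peak of the one-sided power spectrum."""
    signal = signal - np.mean(signal)            # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))**2    # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic slow wave at 0.0034 Hz (the proestrus value reported above),
# 3000 s of signal sampled at an assumed fs = 1 Hz, plus a little noise
fs = 1.0
t = np.arange(0, 3000, 1.0 / fs)
x = (np.sin(2 * np.pi * 0.0034 * t)
     + 0.1 * np.random.default_rng(0).standard_normal(len(t)))
```

With a 3000-sample record the frequency resolution is 1/3000 Hz, fine enough to separate the three phase-specific frequencies reported in the abstract.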

Model based on various Data Mining methods for prediction of infertility treatment outcome

Jankowska D.1, Cwalina U.1, Milewska A.1, Citko D.1, Więsak T.2, Morgan A.3, Milewski R.1

1Department of Statistics and Medical Informatics, Medical University of Bialystok, Poland

2Department of Gamete and Embryo Biology, Institute of Animal Reproduction and Food Research of Polish Academy of Sciences in Olsztyn, Poland

3Shore Institute for Reproductive Medicine, USA

Nowadays, more than 10% of couples have problems with conception. Depending on the causes of infertility there are various methods of treatment. One of them is intrauterine insemination (IUI). Medical databases used to assess the effectiveness of treatment contain a large number of demographic and clinical variables. In such a case, data mining methods are often the only effective way of proceeding. The aim of the study was to create an algorithm for treatment outcome prediction based on various data mining methods. The analyzed dataset contains information about 824 IUI treatment cycles carried out at the Shore Institute for Reproductive Medicine in Lakewood, USA. The cases were repeatedly clustered using K-means and Kohonen neural network methods. Several divisions were chosen based on the largest between-cluster differences in pregnancy rates. Finally, the model with the relatively highest predictive power was built. The results were summarized by basket analysis. To confirm the predictive power, ROC analysis was performed. The applied methods made it possible to create clusters with significantly different percentages of treatment success. The appropriate combination of the selected divisions made it possible to divide the cases into several groups, from the lowest (3.2%) to the highest (21.7%) pregnancy rate. The area under the ROC curve for the created predictive model was AUC=0.63. To conclude: large medical databases often contain hidden information, important for the assessment of treatment success but difficult to discover by conventional statistical methods. The created model shows that a combination of different data mining methods makes it possible to uncover this information.
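The clustering step can be sketched with a bare-bones K-means and a per-cluster pregnancy rate. All data below are synthetic (two made-up feature dimensions and made-up success rates), standing in for the clinical variables, not the actual dataset.

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Minimal K-means (Lloyd's algorithm); X has shape (cases, features).
    Centers are seeded with evenly spaced data points for determinism."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers)**2).sum(-1), axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0)
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels

# Synthetic example: two groups of cycles with different pregnancy rates
rng = np.random.default_rng(1)
grp_a = rng.normal([0, 0], 0.5, size=(200, 2))   # e.g. low-prognosis cases
grp_b = rng.normal([3, 3], 0.5, size=(200, 2))   # e.g. high-prognosis cases
X = np.vstack([grp_a, grp_b])
outcome = np.concatenate([rng.random(200) < 0.03,    # ~3% success
                          rng.random(200) < 0.25])   # ~25% success
labels = kmeans(X, k=2)
rates = [outcome[labels == j].mean() for j in range(2)]
```

Repeating this for several feature subsets and keeping the divisions with the largest between-cluster rate differences mirrors the selection procedure described in the abstract.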

Toward computer-aided analysis of data coming from evaluation sheets of subjects with autism spectrum disorders

Pancerz K.1, Derkacz A.1, Mich O.1, Gomula J.2

1University of Management and Administration in Zamosc, Poland

2Cardinal Stefan Wyszynski University, Poland

The aim of the research is to adapt computational intelligence methods, with particular regard to data mining and machine learning, for computer-aided decision support in the diagnosis and therapy of persons with autism spectrum disorders (ASDs). Computer-aided analysis of data coming from evaluation sheets of subjects with ASDs in the important spheres (among others: self-service, communication, cognitive functioning, functioning in the social and family environment, physical functioning, etc.) enables us to determine trends in the abovementioned spheres (progress, stagnation, or regress) and to support adjustments of the individual therapeutic and educational programs for persons covered by the care. The adapted methods will be implemented in a specialized computer tool. Experiments testing the relative effectiveness of the different methods have been performed on data describing over 70 cases (subjects) classified into three categories: high-functioning, medium-functioning, or low-functioning autism. Each subject was evaluated using an original sheet including questions about competencies grouped into 17 spheres.

A systems biology approach to melanogenesis. Tyrosinase subsystem and eu/pheomelanin synthesis

Matuszak Z.1

1Faculty of Physics and Applied Computer Science, AGH University of Science and Technology, Poland

A return to a holistic view of the organism has been observed in the biomedical sciences in recent years. The development of computer science has enabled successful approaches to the simulation of real biological systems. This general approach is known as Systems Biology (SB). Melanogenesis, the process of melanin pigment synthesis taking place in melanocytes, is a good object of study using the SB approach due to the relatively good knowledge of its biochemistry and regulatory biology. A dynamical model of the interaction between two subsystems playing a fundamental role in melanogenesis - the tyrosinase subsystem and the thiol-dependent regulating (switching) subsystem - was developed in the paper. The model was used to explain some experimentally observed phenomena: the lag period and the interdependence of the two kinds of melanins. Concentration-dependent relations were determined from experimental data. Simulations were performed using CellDesigner and COPASI. Computer simulations were conducted for various concentrations of the basic reagents: mono- and diphenols, enzymes and thiols. It was found that: (a) the activity of tyrosinase depends on the tyrosine and quinone concentrations, changing the so-called tyrosinase lag period; (b) the rate of pheomelanin production is strongly influenced by the concentration of thiols and the initial concentration of tyrosine; (c) the switch between the dominating kinds of final product is triggered by the thiol concentration. The simulation results confirmed the experiments: a switching mechanism exists in the production of eu- and pheomelanin. Eumelanin starts to dominate in melanin synthesis after the exhaustion of the available thiols.

Monte Carlo simulation of fluorescence based photosensitizer dosimetry in pigmented and non-pigmented tissues

Matuszak Z.1

1Faculty of Physics and Applied Computer Science, AGH University of Science and Technology, Poland

Photodynamic therapy (PDT) is used in the treatment of cancer. After injection of a photosensitizer (PS), which leads to accumulation of the PS in the cancerous tissue, light of the proper wavelength is applied to the area of tissue to be treated. The success of PDT depends on the accumulation of the PS within the tissue at a sufficient concentration, and therefore the determination of the tissue PS concentration is a crucial problem in PDT. Many PSs are fluorophores, so measurement of fluorescence in vivo is used to quantify the concentration of the PS. PS dosimetry for PDT is complex, because the distribution of excitation/emitted photons depends on the tissue optical properties. One approach to the investigation of PS fluorescence is the Monte Carlo (MC) simulation method. In the paper, the MC method (MCML, MCVM) was used to quantify the spatial distributions of excitation/emitted photons in both non-pigmented and pigmented tumour tissues. The optical properties of the tissue are based on reported values. The skin model consists of seven layers. The MC simulations were performed for PSs from the porphyrin family. Volumetric distributions of both fluorescence excitation and emission were obtained for different PS and melanin concentrations. It was demonstrated that: (1) the emitted fluorescence intensity is a function of the depth of fluorescence generation; (2) the melanin content in the various layers strongly influences the intensity of the PS emission light. For the excitation light, diffuse reflectance was calculated and compared with skin appearance. It was demonstrated that the seven-layer tissue model is able to predict the spatial distribution of the porphyrin-type PS fluorescence excitation/emission pattern within skin as a function of the PS and melanin concentrations.
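The underlying idea of tracking excitation photons through layers with different absorption can be reduced to a deliberately simplified 1-D sketch (absorption only, no angular scattering, so far less than MCML/MCVM do; layer thicknesses and coefficients below are hypothetical, not the seven-layer model's values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-layer medium: the middle layer is strongly absorbing,
# playing the role of a pigmented (melanin-rich) layer.
boundaries = np.array([0.0, 0.01, 0.05, 0.2])   # layer edges [cm]
mu_a = np.array([10.0, 50.0, 5.0])              # absorption coeff. [1/cm]

def absorbed_depths(n_photons=20000):
    """1-D Monte Carlo: each photon draws exponential free paths with the
    local absorption coefficient; returns the depths of absorption events.
    Photons surviving all layers are transmitted and discarded."""
    depths = []
    for _ in range(n_photons):
        z = 0.0
        for i in range(len(mu_a)):
            step = rng.exponential(1.0 / mu_a[i])   # free path in layer i
            if z + step < boundaries[i + 1]:
                depths.append(z + step)             # absorbed in layer i
                break
            z = boundaries[i + 1]                   # crosses into next layer
    return np.array(depths)
```

Even this crude sketch reproduces the qualitative effect reported above: the depth histogram of absorbed excitation photons is dominated by the pigmented layer, which is why melanin content shapes the fluorescence excitation/emission pattern.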

Reconstruction of hit position and hit time of gamma quanta in the J-PET detector based on the library of averaged model signals

Gupta N.1

1Faculty of Physics Astronomy and Applied Computer Science, Jagiellonian University, Poland

The J-PET detector [1-6] being developed at the Jagiellonian University is a Positron Emission Tomograph composed of long strips of polymer scintillators. One of the major goals of the present research of the J-PET collaboration is to find an optimal reconstruction method to estimate the gamma quanta hit time and hit position, which would make it possible to exploit the very good time resolution achievable with plastic scintillators. To this end we develop a novel reconstruction method based on the comparison of the examined signal with model signals stored in a library. As a measure of similarity we use the Mahalanobis distance [7]. The proposed method takes advantage of the fact that in long scintillator detectors the amplitude and shape of the registered signal change significantly with the hit position. Preliminary results and the application of this method will be discussed and presented.
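The library-matching step can be illustrated with a toy example: each registered signal is reduced to a feature vector and compared against the averaged model signals using the Mahalanobis distance [7]. The feature vectors and "hit positions" below are synthetic stand-ins, not actual J-PET library entries.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of feature vector x from a distribution
    with the given mean and covariance."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# Toy library: synthetic model-signal samples for two hit positions [cm]
rng = np.random.default_rng(0)
library = {0.0: rng.standard_normal((200, 4)) + np.array([1, 2, 3, 4]),
           5.0: rng.standard_normal((200, 4)) + np.array([4, 3, 2, 1])}

def best_position(signal):
    """Return the library position whose model distribution is closest
    to the examined signal in the Mahalanobis sense."""
    dists = {pos: mahalanobis(signal, s.mean(axis=0), np.cov(s.T))
             for pos, s in library.items()}
    return min(dists, key=dists.get)
```

Unlike the Euclidean distance, the covariance term weights each feature by how much the model signals actually vary, which is what makes the measure suitable for comparing noisy registered pulses against library averages.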

[1] P. Moskal et al., Nuclear Medicine Review 15 (2012) C68.

[2] P. Moskal et al., Radiotherapy and Oncology 110 (2014) S69.

[3] L. Raczyński et al., Nucl. Instr. and Meth. A 764 (2014) 186-192.

[4] P. Moskal et al., Nucl. Instr. and Meth. A 764 (2014) 317-321.

[5] P. Moskal et al., Nucl. Instr. and Meth. A 775 (2015) 54-62.

[6] N. G. Sharma, P. Moskal et al., arXiv:1502.07886.

[7] P. C. Mahalanobis, Proceedings of the National Institute of Sciences of India 2 (1936) 49-55.

Cybernetic modelling of human vision in a robot swarm – a distributed cognitive approach

Podpora M.1, Kawala-Janik A.1

1Faculty of Electrical Engineering, Automatic Control and Informatics, Opole University of Technology, Poland

Vision is one of the most important senses for human beings. It is no surprise that robots also use this source, although their efficiency in acquiring knowledge from images is (as yet) much lower. Therefore, there are many hardware tweaks to help robots better understand their surroundings, but software object recognition procedures require a reasonable background and object visibility, so sometimes the object must be moved, the angle or lighting changed, etc. In some cases this is difficult or not possible. The authors propose to accelerate vision understanding by using Cooperative Vision (CoV), i.e. to get video input from cooperating robots and process it in a centralized system. To verify the efficiency of the image understanding procedures, a comparative study is currently being conducted, involving a classical single-camera Computer Vision (CV) mobile robot and two (or more) single-camera CV robots cooperating in CoV mode. The CoV system is being designed and implemented so that the algorithm is able to utilize multiple video sources for the recognition of objects in the very same scene. The CV system, on the other hand, was implemented in a similar way - using a client-server architecture - so that all the problematic issues (network traffic, video compression, etc.) are similar, to enable a reliable comparison. The outcome is not just the possibility of recognizing objects faster by using more cameras. The most important consequence of implementing the proposed CoV is the new possibility of seamless cooperation of robots by using their CV systems to generate common knowledge about the scene. CoV enables true robot interaction, just as we use two hands in our everyday tasks. The implementation potential is huge.

Brain activity measurement system for the purpose of brain-computer interaction

Schneider P.1, Kawala-Janik A.1, Podpora M.1

1Faculty of Electrical Engineering, Automatic Control and Informatics, Opole University of Technology, Poland

This work briefly presents an innovative approach to both the measurement and the analysis of brain electric potentials. The proposed solution may play a significant role in the design and development of non-invasive, EEG-based Brain-Computer Interfaces. The study is the first stage of a broader research process aimed at the implementation of portable, inexpensive EEG headsets easily available on the open market. The research was conducted at the Opole University of Technology, and the processing of the gathered data was carried out in MATLAB with the dedicated BCILAB add-on. For the current research phase only a few healthy adults were tested. The main aim of this stage was to examine the concentration level of the tested subjects. The experiments relied on repeating the same task under various conditions. The results were promising and proved that the environmental conditions strongly affected the signal comparison results. This was done for the purpose of developing a BCI system working in various environments (including noisy ones). Most of the currently developed BCI systems are tested in labs only, while little testing is done under conditions similar to real life. This phase concentrated mostly on the concentration of the tested subjects, in order to make the future system reliable both under various environmental conditions and for patients in various mental states, so that, inter alia, tiredness would not affect its efficiency.

The effect of increasing doses of amitriptyline on cardiomyocytes’ electrophysiology

Tylutki Z.1, Jornil J.2, Polak S.1

1Department of Social Pharmacy, Jagiellonian University Medical College, Poland

2Department of Forensic Medicine, Aarhus University, Denmark

Overdoses of tricyclic antidepressants may lead to ventricular arrhythmia. The aim of the study was to simulate the effect of increasing concentrations of amitriptyline (AMI) and its main metabolite, nortriptyline (NOR), on the action potential of a human ventricular cell. Simulations of the cardiomyocyte action potential were performed in the Cardiac Safety Simulator (CSS) with the use of the O'Hara-Rudy model. The CSS input data included IC50 values (μM) describing the drug concentrations responsible for half-maximal inhibition of the ionic currents, i.e. 3.75 [1], 1.801 [2] and 2.4 [3] for the ICa(L), IKr and INa currents, respectively. Individual concentrations of AMI together with NOR were simulated in Simcyp (V14) for 10 healthy volunteers aged 20-50 years. Nine single doses (mg) were tested: 5, 10, 50, 100, 300, 500, 1000, 5000, 10000. The endpoints of the simulations were: action potential duration at 50% repolarization (APD50), action potential duration at 90% repolarization (APD90), triangulation (APD90-APD50), and the difference between APD90 at tmax and APD90 at baseline (ΔAPD90). Mean APD90, triangulation and ΔAPD90 increased with drug concentration. The lack of a statistically significant ΔAPD90 for therapeutic doses was confirmed using Welch's t-test (p>0.05). ΔAPD90 was statistically significant at both tmax of AMI and tmax of NOR for doses of 1000 mg and above. Early afterdepolarizations (EADs) were observed in all virtual individuals given 10000 mg of AMI. In silico simulations can predict the consequences of various doses of AMI on single cardiac myocytes. Repolarization abnormalities are not expected for therapeutic doses; EADs may be observed in the case of very high doses of AMI.
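The repolarization endpoints above can be computed directly from a simulated action potential trace. A minimal sketch, using a toy triangular action potential rather than CSS output:

```python
import numpy as np

def apd(t, v, level):
    """Action potential duration at a given repolarization level
    (e.g. level=0.9 gives APD90): time from the upstroke until the
    voltage has recovered that fraction of the peak-to-rest range."""
    v_rest, v_peak = v[0], v.max()
    threshold = v_peak - level * (v_peak - v_rest)
    peak_idx = int(v.argmax())
    # first sample after the peak where voltage falls below the threshold
    below = np.where(v[peak_idx:] < threshold)[0]
    return t[peak_idx + below[0]] - t[0]

# toy action potential: instant upstroke at t=0, linear repolarization
t = np.linspace(0.0, 400.0, 4001)                     # ms
v = np.where(t <= 300.0, 40.0 - 125.0 * t / 300.0, -85.0)
v[0] = -85.0                                          # resting potential

apd50 = apd(t, v, 0.50)
apd90 = apd(t, v, 0.90)
triangulation = apd90 - apd50
```

Triangulation, APD90 minus APD50, grows when the late phase of repolarization is prolonged, which is why it is tracked as a proarrhythmia marker.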

Acknowledgements: Project financed by the National Science Centre, Poland, project number 2014/13/N/NZ7/00254.

[1] Park, Kyu Sang, et al. “Fluoxetine inhibits L-type Ca2+ and transient outward K+ currents in rat ventricular myocytes.” Yonsei medical journal 40.2 (1999): 144-151.

[2] Guo, Liang, and Heather Guthrie. “Automated electrophysiology in the preclinical evaluation of drugs for potential QT prolongation.” Journal of pharmacological and toxicological methods 52.1 (2005): 123-135.

[3] Harmer, A. R., et al. "Optimisation and validation of a medium-throughput electrophysiology-based hNav1.5 assay using IonWorks™." Journal of pharmacological and toxicological methods 57.1 (2008): 30-41.

Dosimetric verification of treatment plans in dynamic techniques in radiotherapy

Chmiel A.1, Wasilewska-Radwańska M.2, Dankowska A.1, Wełnogórska K.2

1Department of Children’s and Adults Radiotherapy, University Children’s Hospital of Cracow, Poland

2Faculty of Physics and Applied Computer Science, AGH University of Science and Technology, Poland

Radiotherapy requires preparing a dose simulation that takes into account the location of the tumour and the critical organs. In conventional 3D radiotherapy the in-vivo dosimetry method is applied to control the real dose. In recent years dynamic methods have been increasingly used, e.g. Volumetric Modulated Arc Therapy (VMAT). This method is characterized by rapid changes in dose distribution across the irradiated tissue, which is why the dose cannot be verified with the in-vivo method. Experimental verification of the planned dose distribution in dynamic techniques is carried out by determining the γ index on phantoms. The γ index combines a percentage dose-difference criterion, between the measured dose values and those obtained from the treatment plan data, with a distance-to-agreement criterion. The goal of this work was the verification of the treatment dose distribution in the VMAT method assuming γ index minimization. The treatment plans used in the research were prepared in the Monaco 5.0 planning system. The measurements were performed on an Elekta Synergy accelerator with an Agility multileaf collimator (MLC, 160 leaves). This linac uses a photon beam with an energy of 6 MV. For the dose measurements, an ArcCHECK model 1220 cylindrical water-equivalent phantom with a three-dimensional array of SunPoint® Diode Detectors arranged in a spiral pattern with 10 mm sensor spacing was used. The results were analyzed using the SNC Patient™ software, considering four acceptance criteria: 3%/3mm, 3%/2mm, 2%/3mm, 2%/2mm. In this analysis, no less than 95% compliance of the dose measured on the phantom with the planned dose was obtained for the first criterion (3%/3mm) in all measured plans. For the stricter criteria, 95% compatibility was obtained in a smaller number of the measured plans.
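The γ evaluation described above can be sketched in one dimension. A minimal illustration, assuming a global 3%/3mm criterion normalized to the maximum reference dose (a simplification of the full 3D ArcCHECK analysis):

```python
import numpy as np

def gamma_index(x_ref, d_ref, x_meas, d_meas, dose_tol=0.03, dist_tol=3.0):
    """1D gamma index: for each measured point, the minimum combined
    dose-difference / distance-to-agreement metric over all reference
    points. dose_tol is relative (3%), dist_tol in mm."""
    gammas = []
    d_max = d_ref.max()
    for xm, dm in zip(x_meas, d_meas):
        dose_term = (d_ref - dm) / (dose_tol * d_max)
        dist_term = (x_ref - xm) / dist_tol
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return np.array(gammas)

# identical measured and planned profiles pass everywhere (gamma <= 1)
x = np.linspace(0, 100, 101)        # position, mm
d = np.exp(-((x - 50) / 20) ** 2)   # toy dose profile
g = gamma_index(x, d, x, d)
pass_rate = (g <= 1).mean() * 100   # percent of points passing 3%/3mm
```

A point passes when γ ≤ 1; the pass rate over all points is the quantity compared against the 95% acceptance threshold.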

Mining rules of diagnostic procedures for hypothesis testing problem

Walczak A.1, Winciorek E.2

1Institute of Computer and Information Systems, Military University of Technology, Poland

2Faculty of Cybernetics, Military University of Technology, Poland

In this paper we introduce a methodology showing how to incorporate into a CDSS the statistical tools that can be used to assess the possible effect of a treatment on subjects with reference to vague data stored in medical datasets. In particular, we present a fuzzy generalization of the McNemar test widely used in clinical trials. This binary test is used to determine the effect of treatment on a particular disease when the necessary assumptions about the normal distribution of the data cannot be made. In this paper we consider the situation where the binary classification cannot be performed uniquely. Such problems are very common in clinical trials, where missing or vague data must be constantly borne in mind during the whole research. To overcome this problem, using rough set theory we define the lower and the upper approximation of the possible class of outcomes. Let us consider a case-control study with paired binary response data where binary classification cannot be performed unambiguously for all elements. Moreover, we let one element of a data pair have a missing value. We treat our data as a fuzzy equivalent of the original sample for a McNemar test. To construct a straightforward generalization of McNemar's test we have to find fuzzy counterparts of the test statistic. However, in the case of vague and imprecise data we should not expect these objects to be integer numbers. It is quite natural that imprecise and incomplete input will result in a fuzzy output; the question, however, is how to deal with such imprecision and still obtain an accurate decision in the end. We discuss the approximation of the test statistic in terms of IF-set logic.
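As a baseline for the fuzzy generalization, the crisp McNemar test depends only on the two discordant pair counts b and c. A minimal sketch of the continuity-corrected statistic, with illustrative counts:

```python
from math import erf, sqrt

def mcnemar(b, c):
    """Classical McNemar test on the discordant pair counts b and c
    (e.g. improved-under-treatment-only vs improved-under-control-only).
    Returns the continuity-corrected chi-square statistic and its
    asymptotic p-value (chi-square with 1 degree of freedom)."""
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    # P(chi2_1 > x) = 2 * (1 - Phi(sqrt(x))), Phi = standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(sqrt(chi2) / sqrt(2.0))))
    return chi2, p

chi2, p = mcnemar(b=25, c=10)
```

In the fuzzy setting discussed in the abstract, b and c are no longer integers, so the statistic and its p-value become fuzzy quantities approximated in IF-set logic.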

Evaluation of hemodialysis catheters by SEM and impedimetric analysis

Paradowska E.1, Nycz M.1, Krasicka-Cydzik E.1, Kudliński B.1

1Department of Biomedical Engineering, University of Zielona Góra, Poland

The use of central vascular catheters in hemodialysis treatment is often linked with infectious and thrombotic complications. These are the result of the biofilm formed on the inner surface of the catheter by proteins and bacteria. The aim of this study was to evaluate, by impedimetric and SEM/EDS methods, the inner surface of new catheters (Mahurkar Maxid type) and of catheters (Palindrome and Mahurkar Maxid types) removed from long-term hemodialysis patients. This analysis allowed us to determine the most threatened parts of the catheters and to monitor the film-forming process on the patient-dialyzer model stand, carried out in vitro in a PBS/albumin solution. The obtained impedance characteristics combined with SEM images gave a picture of the changes occurring on the catheter surfaces. The SEM/EDS analysis revealed that the tips of the catheters after implantation were covered by a multilayer biofilm formed by numerous proteins, blood cells and crystals. Fine precipitates made of compounds of carbon (C), sodium (Na), oxygen (O), potassium (K), chromium (Cr), barium (Ba), phosphorus (P), arsenic (As), calcium (Ca), chlorine (Cl) and silicon (Si) were observed on the inner surface of the catheters. Micrographs of the new catheter showed that the surface has numerous cracks, which favours film deposition. The obtained impedance spectra correlate well with the observed microstructural changes. This provides a chance of early detection of biofilm formation, allowing antibiotic therapy to be started as quickly as possible.

Influence of p53 abnormalities on the apoptotic decision in the non-small cell lung cancer lines A549, H1355 and H157

Ochab M.1, Puszyński K.1

1Institute of Automatic Control, Silesian University of Technology, Poland

Non-small cell lung carcinomas (NSCLC) are relatively insensitive to chemotherapy and are thus among the most lethal cancers. Previous biological research shows abnormalities in the p53-dependent apoptotic pathway in NSCLC. The cell line H157 does not express PTEN, and its p53 is mutated in a few codons. In H1355 the p53 protein is mutated in the DNA-binding domain, but PTEN functions properly. A549 carries wild-type p53 and wild-type PTEN. To explore how these abnormalities influence the dynamics of the p53 signaling pathway, and therefore the apoptotic decision, we created a mathematical model of the normal and cancer cell lines. The model contains several modules, including negative and positive feedback loops and the Bcl-2 family proteins on the mitochondrial membrane. In line H157 we take into account the lack of PTEN, which destroys a positive feedback loop. In H157 and H1355 we include the different p53 mutations, which disrupt proper p53 activity. Our model explains the dynamics of protein changes after stress induction and clarifies the differences in the ability to activate apoptosis between the cancer and normal cell lines. The results show a high level of active AKT in H157 cells and low levels in A549 and H1355 cells, which is consistent with biological results. Reduction of the high AKT level in the H157 cell line is sufficient to induce apoptosis and sensitizes the cells to additional stress factors. In cell lines with wild-type PTEN, the crucial element in apoptosis is the p53 level and its ability to act as a transcription factor. In our work we clarify the cause-and-effect relationships between protein levels, their functionality, and sensitivity to stress factors. This project was funded by the Polish National Centre for Science, decision number DEC-2012/05/D/ST7/02072.
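The core of such pathway models is a system of ODEs with feedback loops. A heavily simplified sketch of one ingredient, a p53-Mdm2-style negative feedback loop, with hypothetical rate constants (not the authors' full model):

```python
import numpy as np

def simulate(steps=20000, dt=0.01,
             k_prod=1.0, k_deg=0.5, k_mdm2=0.3, k_decay=0.2):
    """Euler integration of a toy negative feedback loop: p53 is produced
    at a constant rate, degraded proportionally to Mdm2, and drives Mdm2
    production; Mdm2 decays spontaneously. All rates are hypothetical."""
    p53, mdm2 = 0.0, 0.0
    traj = []
    for _ in range(steps):
        dp53 = k_prod - k_deg * mdm2 * p53
        dmdm2 = k_mdm2 * p53 - k_decay * mdm2
        p53 += dt * dp53
        mdm2 += dt * dmdm2
        traj.append(p53)
    return np.array(traj)

traj = simulate()
```

Starting from zero, p53 overshoots and then relaxes toward its steady state. In a full model, a cell-line abnormality is represented by changing the corresponding terms, e.g. removing the PTEN-dependent positive feedback for H157.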

Software for Open Educational Resources and MOOCs

Marchewka D.1, Filip A.1, Urbaniec J.1

1Centre for Distance Learning, Jagiellonian University, Poland

The presentation will cover multiple scenarios of content delivery, both in the form of Open Educational Resources (OER) and Massive Open Online Courses (MOOC). The authors will try to answer the question of when there is a need for one's own platform and when putting a MOOC on an existing platform like Coursera or edX is sufficient. If an own platform is chosen, which software is the best solution: the well-known Moodle, dedicated software like the open-source edX server, or a cloud-based hosting service? The comparison will take into account the students' needs, the teachers' habits and the workload of the IT staff. The answers given during the presentation should help universities develop their own model of creating and delivering both MOOCs and OERs.

A computational study of solid tumour growth and angiogenesis in 3D

Nieć M.1, Psiuk-Maksymowicz K.1

1Institute of Automatic Control, Silesian University of Technology, Poland

The aim of this study is to develop a model that allows the simulation, in a three-dimensional spatial domain, of two processes: solid tumour growth and angiogenesis. Angiogenesis is a physiological process of growth of new blood vessels from the existing vasculature. It also takes place in the case of invasive tumour growth, when cells are subjected to hypoxic conditions. In order to model the dynamics of the growing tumour, continuous multiphase theory was applied. The model distinguishes tumour and normal cells of three types: proliferating, quiescent and necrotic. This part of the model is complemented by reaction-diffusion equations for oxygen and vascular endothelial growth factor. The process of angiogenesis is modeled by a discrete model, which enables modelling the growth and sprouting of new vessels, the creation of functional vessel loops, and the simulation of blood flow. Depending on the model settings and parameters, the model allows the creation of vessel networks that correspond to reality. A number of simulations were performed in order to present vasculatures of different tortuosity and density and the resulting shapes of the tumour. Additionally, the results were compared to models of the Hahnfeldt type. The developed model captures the behaviour of tumour growth and angiogenesis. It may be used to analyze the perfusion and oxygenation of the tissue, and it provides the basis for a model that considers drug treatment and the 3D distribution of the drug.
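The reaction-diffusion part of such a model can be sketched with an explicit finite-difference step. A 1D toy version for oxygen with hypothetical diffusion and consumption coefficients (the actual model is 3D and coupled to the cell phases):

```python
import numpy as np

def step(c, D=1.0, k=0.1, dx=1.0, dt=0.1):
    """One explicit Euler step of dc/dt = D * laplacian(c) - k*c:
    oxygen diffuses and is consumed at a constant specific rate.
    Periodic boundary via np.roll; stable because D*dt/dx^2 <= 0.5."""
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    return c + dt * (D * lap - k * c)

c = np.zeros(50)
c[25] = 1.0                  # point source of oxygen (e.g. a vessel)
for _ in range(100):
    c = step(c)
```

After the loop the initial spike has spread out while the total amount has decayed through consumption; in the full model the source term instead follows the discrete vessel network.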

This work was supported by the National Science Centre (NCN) under grant 2011/03/B/ST6/04384, and performed using the infrastructure supported by POIG.02.03.01-24-099/13 grant: “GCONiI - Upper-Silesian Center for Scientific Computation”.

Modeling the dynamics of the insulin-glucose subsystem using a multiagent approach based on knowledge-based communication

Meszyński S.1, Sokolov O.1

1Faculty of Physics, Astronomy and Informatics, Nicolaus Copernicus University, Poland

Mathematical analytical modeling and computer simulation of a physiological system is a complex problem with a great number of variables and equations. The objective of the research is to describe the insulin-glucose subsystem using multiagent modeling based on intelligent agents. Such an approach makes the modeling process easy and clear; moreover, new agents can easily be added to the investigations. In the proposed approach, a multi-agent system was used to build a model of glycemic homeostasis, using agents to represent the selected elements of the human body taking an active part in this process. Each of the agents has features that allow it to reproduce the basic properties of the simulated organ. The developed multi-agent system also has a knowledge base, through which it is possible to analyze the accuracy of predictions of the normal and pathological states of the simulated process. The results obtained within the multi-agent system are consistent with those obtained by analytical modeling of the same insulin-glucose models. The proposed approach allows for a more intuitive modeling of physiological processes, where each agent can represent either a specific physiological process or a separate organ involved in it. By equipping the system with a knowledge base, a tool is obtained for the simulation, analysis and prediction of the modeled physiological system.
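A classical analytical baseline for such comparisons is the Bergman minimal model of glucose-insulin dynamics, which the per-organ agents effectively decompose. A minimal Euler-integration sketch with illustrative parameter values:

```python
def simulate(G0=300.0, Gb=90.0, Ib=7.0, p1=0.03, p2=0.02, p3=1e-5,
             insulin=lambda t: 7.0, dt=0.1, t_end=300.0):
    """Bergman minimal model: G is plasma glucose, X the remote insulin
    action, insulin(t) the plasma insulin input. Parameters are
    illustrative, not fitted to data.
        dG/dt = -(p1 + X) G + p1 Gb
        dX/dt = -p2 X + p3 (I(t) - Ib)
    """
    G, X, t = G0, 0.0, 0.0
    while t < t_end:
        dG = -(p1 + X) * G + p1 * Gb
        dX = -p2 * X + p3 * (insulin(t) - Ib)
        G += dt * dG
        X += dt * dX
        t += dt
    return G

G_final = simulate()
```

With insulin held at its basal value Ib, the insulin action X stays zero and glucose simply relaxes from the initial 300 toward the basal level Gb; an agent-based model reproduces this lumped behaviour through the interaction of organ agents.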

Quantitative criterion of the collective behaviour of bacteria

Czapla R.1, Mityushev V.1

1Department of Computer Sciences and Computer Methods, Pedagogical University, Poland

The study of the collective behaviour of bacteria is a challenging problem in biology. Recent mathematical modelling of collective behaviour was discussed by S. Gluzman, D.A. Karpeev and L.V. Berlyand [Effective viscosity of puller-like microswimmers: a renormalization approach, J. R. Soc. Interface, 2013] and in other papers cited therein. In the present talk, we propose a new quantitative criterion of collective behaviour and study the experimental locations of Bacillus subtilis bacteria (in the form of 31 film frames) in a very thin liquid film [A. Sokolov, I.S. Aranson, J.O. Kessler, R.E. Goldstein, Concentration Dependence of the Collective Dynamics of Swimming Bacteria, Phys. Rev. Lett., 2007]. The criterion is based on the basic sums (BS) (Eisenstein-Rayleigh sums) outlined by V. Mityushev and W. Nawalaniec [Basic sums and their random dynamic changes in description of microstructure of 2D composites, Comp. Mat. Sci., 97, 64-74, 2015]. Following this paper, we introduce a constructive method to generate theoretical distributions of bacteria in a representative cell. Next, we calculate the theoretical BS corresponding to chaotic, non-collective behaviour and compare them to the sums obtained from the experimental distributions of bacteria. Comparison of the theoretical and experimental BS shows the difference between chaotic and non-chaotic distributions. In the idealized case, a non-chaotic distribution can be represented by a regular location. Our results demonstrate a hidden regularity of the bacterial locations, which confirms their collective interactions and, hence, their collective behaviour. This approach can be used in general biological investigations of the locations and interactions of statistically homogeneous random objects.
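The logic of the criterion, comparing a statistic of the observed point pattern against its value for chaotic (uniformly random) patterns, can be illustrated with a much simpler pairwise statistic. A toy sketch using the mean nearest-neighbour distance, NOT the Eisenstein-Rayleigh basic sums themselves:

```python
import math
import random

def mean_nn_distance(points):
    """Mean distance from each point to its nearest neighbour
    (brute force O(n^2), fine for small samples)."""
    total = 0.0
    for i, p in enumerate(points):
        total += min(math.dist(p, q) for j, q in enumerate(points) if j != i)
    return total / len(points)

random.seed(0)
n = 100
chaotic = [(random.random(), random.random()) for _ in range(n)]          # uniform
regular = [(i / 10 + 0.05, j / 10 + 0.05) for i in range(10) for j in range(10)]

d_chaotic = mean_nn_distance(chaotic)
d_regular = mean_nn_distance(regular)   # lattice spacing is exactly 0.1
```

A regular lattice has a larger mean nearest-neighbour spacing than a uniformly random pattern of the same density, so the statistic separates the two cases; the basic sums play the analogous discriminating role in the actual method.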

Comparison of the bending rigidity coefficient obtained using molecular dynamics and vesicle fluctuation analysis

Drabik D.1, Kraszewski S.1, Langner M.2

1Department of Biomedical Engineering, Wrocław University of Technology, Poland

2Lipid Systems sp. z. o. o. [Ltd], Poland

The lipid bilayer elasticity, described by the bending rigidity coefficient, plays an important role in cell functioning. The determined value of the coefficient depends on the experimental method used. We combine two different techniques to gain insight into the relationship between molecular-level phenomena and macroscopic mechanical properties. Vesicle fluctuation analysis (VFA): a series of images of GUVs labelled with the Bodipy fluorescent probe was collected using an LSM 510 (Zeiss, Germany). The images were then numerically analyzed using a statistical approach described in detail elsewhere. Molecular dynamics (MD): a full-atom 0.5 ns simulation of 176 POPC lipid molecules was performed using the NAMD software (CHARMM27, NPT). In order to derive the bending rigidity coefficient, the procedure described by Levine et al. was used. Specifically, each lipid is represented as a vector and an FFT is applied to obtain the longitudinal power spectra, which are correlated with the bending rigidity coefficient. The longitudinal power spectra generated from the MD simulations provide a means to determine the bending rigidity coefficient, which was equal to (45±12) kBT. The bending rigidity coefficient obtained from VFA was equal to (47±13) kBT. The presented data show that the experimental results (VFA) are in agreement with the predictions derived from MD. This correlation makes it possible to develop MD simulations as predictive tools for macroscopic membrane properties.
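The spectral route to the bending rigidity rests on equipartition: for a tensionless membrane, each undulation mode has mean square amplitude kBT/(A κ q⁴). A sketch that synthesizes such a spectrum with a known κ and recovers it by fitting (hypothetical numbers, not the Levine et al. procedure itself):

```python
import numpy as np

rng = np.random.default_rng(42)
kappa_true = 45.0                    # bending rigidity, in units of kBT
area = 1.0                           # projected membrane area (arbitrary units)
q = np.linspace(0.5, 5.0, 40)        # undulation wave numbers
n_frames = 20000                     # simulated trajectory frames per mode

# equipartition: <|u_q|^2> = kBT / (area * kappa * q^4); a time average
# over n_frames behaves like a scaled chi-square random variable
var = 1.0 / (area * kappa_true * q**4)
spectrum = var * rng.chisquare(2 * n_frames, size=q.size) / (2 * n_frames)

# recover kappa from the plateau of <|u_q|^2> * q^4 = 1 / (area * kappa)
kappa_fit = 1.0 / (area * np.mean(spectrum * q**4))
```

Multiplying the measured spectrum by q⁴ flattens it to a plateau whose height gives κ directly; real analyses additionally have to handle tension, protrusion modes and finite sampling.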

Acknowledgement: This work was supported by DEMONSTRATOR+ grant UOD-DEM-1-027/001. Calculations have been carried out using resources provided by the Wroclaw Centre for Networking and Supercomputing, grant No. 274.

Human-machine interface with online gesture classification towards dexterous manipulator teleoperation

Mankowski T.1, Tomczyński J.1, Kaczmarek P.1, Kasiński A.1

1Institute of Control and Information Engineering, Poznan University of Technology, Poland

The objective of the research was to develop a user-friendly human-machine interface relying on EMG signals registered at the operator's forearm. The main assumption was to enable the user to control a multi-DOF manipulator with a smooth learning curve. The interface has to be intuitive and must not interfere with other actions performed by the operator, while minimizing the possibility of unintentional triggering. A modular amplifier system was developed, allowing reliable multichannel EMG acquisition in environments with high interference levels, and a software framework for online data analysis was developed. A neural network with a softmax output layer and an entropy-based trust function, enabling the classifier to avoid false detections, was implemented. An exemplary manipulator, a five-finger dexterous gripper, was proposed as a testbed for responsiveness evaluation. The amplifier system allows the acquisition of EMG data with high SNR and low artifact levels without hindering user movement. Several classes of gestures were taken into consideration, and their mutual precision and recall rates were evaluated. The classifier proved to be efficient in rejecting gestures that were not present during the training session, and thus can be used as a reliable control interface. The classified gestures were mapped to various gripper actions. The proposed EMG acquisition hardware and classification algorithms provide a novel way of human-machine interaction. The implementation of classification trust valuation allows the system to reject activity not present in the control gesture group, so the user can perform unrelated actions without triggering the interface.
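The entropy-based trust function can be sketched as a rejection rule on the softmax output: low entropy means a confident, accepted classification. A minimal illustration with a hypothetical threshold value:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a vector of logits."""
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_with_rejection(logits, max_entropy=0.5):
    """Return the predicted class index, or None when the output
    distribution is too flat (entropy above the trust threshold),
    i.e. the gesture does not resemble any trained class."""
    p = softmax(np.asarray(logits, dtype=float))
    entropy = -np.sum(p * np.log(p + 1e-12))
    return int(p.argmax()) if entropy < max_entropy else None

confident = classify_with_rejection([8.0, 0.5, 0.2])   # peaked: accepted
ambiguous = classify_with_rejection([1.0, 0.9, 0.8])   # flat: rejected
```

The rejected (None) outcome is what lets the operator perform unrelated arm movements without triggering the gripper.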

Study on CIE Exoskeleton interface for paraplegic users

Kaczmarek P.1, Kabaciński R.1, Kowalski M.1

1Institute of Control and Information Engineering, Poznan University of Technology, Poland

Restoring walking abilities to people with lower extremity paralysis is an important task. Lower extremity exoskeletons offer both restored mobility and physiotherapy at the same time. One of the biggest challenges in the area of exoskeletons for paraplegic users, aside from energy requirements and mechanical design, is an intuitive and robust interface that allows the operator to initiate and control walking by himself. Such an interface would not only improve the effectiveness of physiotherapy sessions but would also enable the use of the exoskeleton for daily locomotion. In this paper a concept of such an interface, based on the center-of-pressure location and the exoskeleton pose, is proposed. The CIE Exoskeleton was used in the study. It is equipped with four actuated joints, in the hips and knees, which is the minimal set of actuators enabling paraplegic operators to walk in the exoskeleton with the use of crutches. The proposed control interface is based on a state machine in which movement patterns are divided into sub-states, with transitions dependent on the exoskeleton pose and the estimated center-of-pressure location. The operator chooses the type of movement with a joystick from the following set: walking, sitting down, and standing up. Cyclic movements and sub-state transitions are initiated automatically based on the center-of-pressure localization and the operator's pose. The interface was tested on a healthy subject to verify its assumptions. The results show that this type of interface can be used to initiate and control the movement of the exoskeleton.
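The interface logic can be sketched as a small state machine whose transitions are gated by the joystick command and the center-of-pressure (CoP) estimate. A minimal illustration with hypothetical state names and gating:

```python
# hypothetical sub-states and transitions; walking alternates between
# weight shift and leg swing, other commands switch posture directly
TRANSITIONS = {
    ("standing", "walk"): "weight_shift",
    ("weight_shift", "walk"): "swing",
    ("swing", "walk"): "weight_shift",
    ("standing", "sit"): "sitting",
    ("sitting", "stand"): "standing",
}

def next_state(state, command, cop_over_stance_foot):
    """Advance the state machine; walking sub-state transitions
    additionally require the center of pressure to lie above the
    stance foot before a step is triggered."""
    target = TRANSITIONS.get((state, command))
    if target is None:
        return state                    # no transition defined
    if command == "walk" and not cop_over_stance_foot:
        return state                    # wait for a safe weight shift
    return target

s1 = next_state("standing", "walk", cop_over_stance_foot=False)
s2 = next_state("standing", "walk", cop_over_stance_foot=True)
```

Gating steps on the CoP is what makes the cyclic movement self-initiated: the operator shifts weight with the crutches, and the machine only then releases the next sub-state.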

Local control approach for CIE exoskeleton

Kaczmarek P.1, Kowalski M.1, Kabaciński R.1, Kasiński A.1

1Institute of Control and Information Engineering, Poznan University of Technology, Poland

A lower-limb exoskeleton for people with mobility impairments is a robot that surrounds the user's legs and torso and can be used to support his or her locomotion and rehabilitation. Developing control methods dedicated to an exoskeleton is an attractive research issue, as control significantly influences performance and involves dealing with a complex, nonlinear object and close physical interaction between the robot and the user. The research was conducted with the use of the CIE exoskeleton, which enables basic locomotion patterns such as sitting/standing and walking with crutches. It consists of modules with actuated hips and knees and passive ankles. The control task is performed by a master controller, responsible for the interface and the generation of joint trajectories, and by distributed slave controllers in the actuated joints enforcing the realization of these trajectories. The study examined the effectiveness of reproducing predefined trajectories in a local control scheme. The slave controller inputs are signals related only to the state of a particular joint; the states of the other joints are not accessible. Loads generated by the movements of the other joints and by gravity were treated as an external disturbing torque. This torque was estimated with a Kalman filter incorporating a simple pendulum model and compensated in a feed-forward loop. The simple pendulum model was identified for a particular segment with the prediction error method. The control methods were implemented and tested on the CIE exoskeleton. The tests proved that the proposed method with disturbing-torque estimation increased the quality of control.
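Disturbance estimation of this kind is commonly done by augmenting the joint state with the unknown torque and running a standard Kalman filter on it. A sketch with a linearized pendulum and hypothetical parameters (not the identified CIE model); the control input is omitted for brevity:

```python
import numpy as np

dt, J, k = 0.01, 1.0, 2.0              # step [s], inertia, gravity stiffness
# discrete model, state x = [theta, omega, tau_d]; the disturbance torque
# tau_d is modeled as a (nearly) constant extra state driving omega
A = np.array([[1.0, dt, 0.0],
              [-k * dt / J, 1.0, dt / J],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])        # only the joint angle is measured
Q = np.diag([1e-8, 1e-8, 1e-4])        # small random walk on tau_d
R = np.array([[1e-6]])

rng = np.random.default_rng(0)
x_true = np.array([0.0, 0.0, 0.5])     # true disturbance torque = 0.5
x_hat = np.zeros(3)
P = np.eye(3)
for _ in range(2000):
    x_true = A @ x_true                               # simulate the joint
    z = H @ x_true + 1e-3 * rng.standard_normal(1)    # noisy angle sensor
    x_hat = A @ x_hat                                 # predict
    P = A @ P @ A.T + Q
    K_gain = P @ H.T @ np.linalg.inv(H @ P @ H.T + R) # update
    x_hat = x_hat + K_gain @ (z - H @ x_hat)
    P = (np.eye(3) - K_gain @ H) @ P

tau_d_hat = x_hat[2]
```

The recovered `tau_d_hat` is what would be added to the slave controller's output in the feed-forward compensation loop.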

Simulation of chemo-mechanical processes in porous hydrogels. Identification of the model parameters

Kazimierska-Drobny K.1, Kaczmarek M.1

1Institute of Mechanics and Applied Computer Science, Kazimierz Wielki University, Poland

The reactivity of hydrogels means that the effective determination of their structural and physical properties demands both complex experimental studies and the application of a mathematical model. This paper presents identification procedures for the transport, mechanical, and coupled chemo-mechanical and chemo-osmotic parameters of porous PVA hydrogels. The macroscopic model of chemo-mechanical deformation is based on the Biot poroelasticity model with chemo-mechanical couplings. The studies of chemo-mechanical and transport processes were carried out with a specially designed experimental set-up. The chemical, mechanical and thermal deformation of the sample was continuously recorded by the time-of-flight method using the Optel ultrasonic testing system. The tested materials were PVA hydrogels. The transport of salt into the hydrogel material results in concentration changes in the reservoir and in deformation of the hydrogel. The identification procedure was carried out with optimization methods implemented in Matlab. The identification results for 25% PVA are: diffusion coefficient D = (6.08±0.13)e-10 m2/s, retardation factor R = 1, drained Young modulus Es = 0.902±0.030 MPa, Poisson's ratio ns = 0.31±0.03, hydraulic permeability K = 3.29±1.16 m5/kg·s; chemo-mechanical parameters d = 417.11±67.12 m2/s2 and g = -2.88±0.99 m2/s2; chemo-osmotic parameter kc = 2.58±1.24 m2/s2. The obtained results play a crucial role in the recognition of the associated mechanical, chemo-mechanical, chemo-osmotic and transport behaviour of hydrogels and may form a basis for extending the work towards other chemically sensitive materials such as gels and soils.
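The flavour of the identification step can be shown on the diffusion coefficient alone: synthesize a reservoir-concentration decay with a known D, then recover it by least squares. A toy sketch with a hypothetical slab geometry (slowest-mode approximation), standing in for the full Matlab optimization:

```python
import numpy as np

L_len = 1e-3                         # hypothetical diffusion length, m
D_true = 6.08e-10                    # "unknown" diffusion coefficient, m^2/s
t = np.linspace(0.0, 2000.0, 200)    # sampling times, s

# slowest diffusion mode of a slab: c(t) ~ exp(-D (pi/2L)^2 t)
rate = (np.pi / (2 * L_len)) ** 2 * D_true
rng = np.random.default_rng(1)
c = np.exp(-rate * t) * (1 + 0.01 * rng.standard_normal(t.size))  # 1% noise

# log-linear least squares: ln c = -rate * t, so the slope gives D back
slope = np.polyfit(t, np.log(c), 1)[0]
D_fit = -slope / (np.pi / (2 * L_len)) ** 2
```

The real procedure fits all coupled parameters simultaneously against the Biot-type model, but each parameter enters the optimization in essentially this way: simulate, compare with the recorded signal, and minimize the misfit.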

The use of Direct Metal Laser Sintering for the manufacturing of prosthetic constructions from CoCrMo

Bojko Ł.1, Ryniewicz A.1

1Faculty of Mechanical Engineering and Robotics, AGH The University of Science and Technology, Poland

Direct Metal Laser Sintering (DMLS) is a new technology for producing prosthetic constructions from CoCrMo alloy. The aim of the study was to compare DMLS technology and casting technology in terms of the microstructure and hardness of the CoCrMo alloy. The study material consisted of samples of the EOS CobaltChrome SP2 alloy made with DMLS technology. The control group consisted of CoCr samples of Remanium GM 800+ made with centrifugal casting technology. Ten rectangular samples were made of each material. Microstructural studies of the samples were performed on a scanning microscope with an X-ray attachment for analyzing the chemical composition. Hardness measurements were performed with the Vickers test, with the hardness determined from 25 measurements on each sample. The microstructure of the EOS CobaltChrome SP2 alloy made with DMLS technology showed a homogeneous chemical composition. In the SEM images, grain boundaries were visible whose varied geometric shapes provide the compactness of the structure. In the cast Remanium GM 800+ microstructures obtained with the centrifugal method, dendritic microsegregation and porosity were observed. On the basis of the micromechanical studies it was found that the alloy made with DMLS technology has an average hardness of 629 HV and the casting an average of 513 HV; in the first method, traces of slip lines resulting from inhomogeneous hardness were visible in the penetration zones. On the basis of the tests carried out, it can be stated that DMLS is, in terms of microstructure and hardness, a preferred new technology for manufacturing prosthetic structures and can be an alternative to methods based on traditional casting.

Can modeling of the hip joint reveal the excellent characteristics of its biobearing?

Ryniewicz A.1

1Faculty of Mechanical Engineering and Robotics, AGH The University of Science and Technology, Poland

These considerations are the basis for identifying the conditions of friction and lubrication in the healthy hip joint. The methods included: a unique investigation of the geometry of the friction node, with shape analysis of the cooperating surfaces and of the arrangement of gaps; determination of the multi-layer structure of the pin and acetabulum of the biobearing, with analysis of the cartilage tissue; analysis of the structure and rheological properties of the synovial fluid; and numerical analysis of displacements, deformations and stresses in the loaded biobearing. The simulation analysis shows the relations between the bone-cartilage layers and the synovial fluid in the biobearing. The lubrication mechanism stimulates diverse displacements in the loaded cartilage structures of the pin and acetabulum, which cooperate with the synovial fluid. The synovial fluid, whose components have an extended spatial configuration and specific rheological properties, fills the geometrically diverse zone of cooperation. Bioelastohydrodynamic lubrication in the healthy hip joint makes direct contact of the bearing surfaces impossible under the conditions of joint loading during movement. Stress analysis confirmed that stresses are damped in the cartilage tissue and transferred to the interior of the osseous structures, which are considerably more resistant and biologically prepared for remodeling under load. The hitherto existing theory should be completed with two effects: the effect of the friction-node geometry, in which the load is taken up by the pressure generated in the system of gaps created by the anatomical shapes of the head and acetabulum surfaces, and the effect of the structural and biochemical properties of the synovial fluid, in which deformation is observed in connection with the occurrence of normal stresses.

Biomechanics of prosthetic bridges

Ryniewicz A.1, Ryniewicz W.2

1Faculty of Mechanical Engineering and Robotics, AGH The University of Science and Technology, Poland

2Faculty of Medicine, Jagiellonian University Medical College, Poland

The basic clinical requirement for a stable prosthetic restoration is to design its construction so that during mastication it does not cause any traumatic overload. The aim of the study was the analysis of the changes of stresses occurring under occlusal load in the dento-alveolar complex and bone structures of the lower jaw, as well as in the prosthetic bridgework applied in this system. Four variants of constructional solutions of bridgework in the lateral section of the lower jaw were investigated. Models were prepared on the basis of a procedure of spatial imaging of the object from CT. Such a model, with the developed computer procedure applied, was analyzed with the finite element method, indicating the distribution of stresses in the constructions of the prosthetic bridgework, the abutment teeth, the periodontium tissue and the two-layered structure of the lower jaw under mastication conditions. It was found that under mastication conditions considerable differences occurred in the localization, values and extent of the areas of stress concentration for the various configurations of bridgework spans. In the bridgework constructions, the areas unsafe with regard to stress concentration were: the constrictions of the spans, the junctions of the span with the crowns of the abutment teeth, and the crown alloys of these teeth. The simulations also demonstrated the damping and suppressive function of the suspensory apparatus of the abutment teeth. The numerical method indicates the areas of strain in the construction of the prosthetic bridgework as well as in the base tissues, which gives the basis for planning the optimal solution for the individual conditions of a given patient.

The use of 3D modeling of the pelvic girdle in planning of resurfacing surgery

Ryniewicz A.1, Madej T.1, Ryniewicz A.2, Bojko Ł.1

1Faculty of Mechanical Engineering and Robotics, AGH The University of Science and Technology, Poland

2Faculty of Mechanical Engineering, Laboratory of Coordinate Metrology, Cracow University of Technology, Poland

The aim is to develop a method of biometrological analysis of the reconstructed hip complex and virtual application of a resurfacing endoprosthesis, selected and positioned on the basis of morphometric measurements of the pelvic girdle of an individual patient. The preoperative application procedure was carried out on a model of the pelvic girdle reconstructed from CT. On the basis of the biomeasurements, the DHRS hip resurfacing endoprosthesis was chosen, and the acetabular component was virtually located in the pelvic bone and the femoral component on the head of the femoral bone. The determined diameters of the femoral head were: the largest 50.0 mm and the smallest 47.6 mm, with a femoral head-neck index of 1.36. The neck-shaft angle was 127°, and the antetorsion angle of the femoral neck was 12°30′. The acetabulum was oval, its symmetry axes were 54.6 mm and 53.7 mm, and its depth was 23 mm. The inclination angle between the acetabular lip and the horizontal plane was 48°20′, and the anteversion angle between the axis of the acetabulum and its projection on the frontal plane was 14°40′. Supporting the implantation with 3D reconstruction, solid modelling, biometrology and fixation of the endoprosthesis in the virtual bone structure allows its selection to be optimised. Such selection will secure the geometrical parameters in the range of the bone bed, the depth of the acetabular component, the inclination and anteversion angles, the femoral neck-shaft angle and the antetorsion angle of the neck, and will preserve the biomechanical axis of the limb, the physiological point of rotation in the implanted joint, and the location of the axes of rotation of both joints in the horizontal plane.

The use of patterns to evaluate the quality of mapping in CT

Ryniewicz A.1, Knapik R.1, Sładek J.1, Bojko Ł.2

1Faculty of Mechanical Engineering, Laboratory of Coordinate Metrology, Cracow University of Technology, Poland

2Faculty of Mechanical Engineering and Robotics, AGH The University of Science and Technology, Poland

Diagnostic and preoperative procedures using CT require imaging and reconstruction of spatial tissue structures together with dimensional and shape analysis. The aim is to analyse selected procedures for assessing the accuracy of mapping geometrical parameters in CT, using reference patterns to compare diagnostic equipment from the point of view of measurement applications. Tomographic studies of ball patterns and a Ball Bar pattern were performed with a 64-slice Siemens Somatom Cardiac CT scanner. Mapping accuracy was assessed by comparing the models reconstructed from CT imaging with the geometrical parameters given in the calibration certificates. The comparison was performed by best-fit alignment using the programs 3D Reshaper, PolyWorks, Geomagic and Amira. The results of the assessment are presented as accuracy maps of the surfaces of the pattern balls, together with the determined diameter values and experimental standard deviations for balls made of different materials. In addition, the measurement uncertainty of the distance between ball centres and of the ball diameters was determined for the Ball Bar pattern. In conclusion, CT introduces a geometry-mapping error that depends on the size, shape and material of the scanned spherical surface, its position relative to the gantry, and the location of the object in the measurement volume.

Experimental studies and modeling of selected layers of the trachea on the basis of EBUS

Ryniewicz A.1, Soja J.2, Ryniewicz W.2, Sładek K.2

1Faculty of Mechanical Engineering, Laboratory of Coordinate Metrology, Cracow University of Technology, Poland

2Faculty of Medicine, Jagiellonian University Medical College, Poland

The aim is to replicate the shape of the trachea and the layered structure of its wall on the basis of endobronchial ultrasound (EBUS) diagnostic examinations, using numerical modelling and reference patterns. Examinations of the trachea in patients with bronchial asthma, and of one- and two-layer reference models, were performed in the Laboratory of Invasive Thoracic Diagnostics, UJ CM, using an EBUS system. In the diagnosis and therapy of bronchial asthma, the structure of the first and second layers, which comprise the mucosa, submucosa and smooth muscle, is of particular importance; therefore the procedure for distinguishing structures by preset thresholds selectively considered the surfaces of the first and second layers. The results are presented as digital images obtained with the EBUS system, which were then segmented by thresholding with hysteresis and analysed by computer. Areas corresponding to particular tissue layers of the trachea were separated by specifying upper and lower grayscale thresholds, and curves delimiting the selected areas were generated. Spatial reconstruction was then performed by combining the flat models and generating solids corresponding to the structure of the trachea. The study allowed thickness and surface measurements of the first and second layers in selected cross-sections perpendicular to the tracheal axis, while the 3D configurations allowed determination of bronchial wall deformation. In conclusion, numerical modelling and 3D reconstruction of anatomical structures and patterns on the basis of EBUS may serve as a method to evaluate the shape of the tracheal wall, distinguish its layered structure and determine the accuracy of mapping.
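The thresholding-with-hysteresis step used above can be sketched in Python; this is a generic illustration (the threshold values and the toy image are hypothetical, not taken from the EBUS study):

```python
import numpy as np
from scipy import ndimage

def hysteresis_threshold(img, low, high):
    """Keep pixels >= low that belong to a connected component
    containing at least one pixel >= high (a strong seed)."""
    mask_low = img >= low
    mask_high = img >= high
    labels, _ = ndimage.label(mask_low)   # 4-connected components of the weak mask
    keep = np.unique(labels[mask_high])   # component ids that contain a strong seed
    keep = keep[keep != 0]
    return np.isin(labels, keep)

# toy grayscale patch: only the region touching the strong pixel survives
img = np.array([[0, 0, 0],
                [5, 3, 0],
                [0, 0, 3]], dtype=float)
mask = hysteresis_threshold(img, low=3, high=5)
```

The weak pixel at (1, 1) is kept because it is connected to the strong seed at (1, 0), while the isolated weak pixel at (2, 2) is rejected.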

Artificial neural networks in the diagnosis of pigmented skin lesions: A review

Jaworek-Korjakowska J.1

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

The development of computer-aided diagnostic systems for the examination of dermoscopic color images can assist in the early diagnosis of malignant melanoma, which has the potential to reduce increasing mortality rates worldwide. Skin cancer is the most commonly diagnosed type of cancer in all people, regardless of age, gender, or race; the most malignant type is melanoma, a dangerous proliferation of melanocytes. A review of artificial neural networks applied to dermoscopic images of skin lesions was carried out on the basis of the existing literature. The review aims to provide an extensive introduction, clarify the terminology used in the literature, and group together relevant references on the different methods applied to dermoscopic image classification. Various approaches to implementing computer-aided diagnosis systems based on artificial intelligence, and their standard workflow components, are reviewed. Comparing the classification results obtained with neural networks with others in the field of automated diagnosis of pigmented skin lesions, it becomes clear that artificial intelligence has great potential in skin lesion classification. One of the main tasks of modern dermatology is the detection of melanoma at an early stage of development. This can be achieved by combining new instruments with computer-aided diagnosis systems based on sophisticated classification methods, whose aim is to increase the accuracy of melanoma recognition. The current review suggests that computer-aided systems are a practical tool for dermoscopic image assessment and can be recommended for both research and clinical applications.

Make the invisible visible – how to use infrared light to detect early atherosclerosis – a study review

Pociask E.1, Malinowski K.2

1Faculty of Electrical Engineering Automatics Computed Science and Biomedical Engineering, AGH University of Science and Technology, Poland

2Faculty of Health Science, Jagiellonian University Medical College, Poland

Coronary atherosclerosis is the major cause of coronary heart disease. In recent years we have been witnessing a sort of competition between the available cardiovascular imaging techniques to become the next reference method for assessing coronary artery disease in daily clinical practice. This competition is driven mainly by the need for a more in-depth understanding of the atherosclerotic process, in the hope of finding a technique that can contribute to improving and optimizing new devices and treatment methods. Coronary angiography is still the gold standard in daily practice, but its major limitations, possible foreshortening and the fact that it visualizes the lumen only, can hide problems within diffusely diseased arteries. Coronary angiography is not sufficient to provide in-depth knowledge of coronary plaque or to predict its progression or regression. The newer intracoronary imaging techniques such as optical coherence tomography (OCT), and techniques that have developed dynamically over the last five years such as intravascular photoacoustic imaging (IVPA), near-infrared spectroscopy (NIRS), the fusion of NIRS and OCT, and polarization-sensitive OCT, could bring additional value, making them potentially extremely useful, sometimes crucial, in making treatment decisions or evaluating a given treatment. The presented techniques demonstrate the ability to directly image tissue components in the vessel wall; in addition, near-infrared spectroscopy and photoacoustic imaging show high chemical specificity for lipid plaque. This review discusses the potential role of the new cardiovascular imaging techniques that could become the new gold standard for detecting vulnerable plaque in clinical practice.

Modeling crosstalk between ATM and ATR signaling pathways after exposure to IR

Kurpas M.1, Puszyński K.1

1Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Poland

Ataxia telangiectasia mutated (ATM) protein kinase detects double-strand breaks (DSBs) occurring in the cell in response to various agents, such as ionizing radiation or drugs, while ataxia telangiectasia and Rad3-related (ATR) kinase is activated in the presence of single-stranded DNA (ssDNA). Such forms occur at stalled replication forks or during DNA repair or replication. Biological reports suggest that the ATR module is also activated by the presence of DSBs. We hypothesised that the purpose of ATR pathway activation might be the resection of DSB ends by repair complexes. We developed a simple mathematical model of DSB and ssDNA detection. Our model is based on the Haseltine-Rawlings postulate, which combines the deterministic and stochastic approaches: we use ordinary differential equations for fast reactions and the direct Gillespie method for slow reactions. We implemented the resection mechanism as a connection between the ATM and ATR pathways. Our results show the importance of the resection mechanism in the response to ionizing radiation. We found that despite faulty double-strand break repair, some lesions are still recognized and repaired with the contribution of the ATR module. Moreover, we identified single-strand break repair as crucial for cell survival. Our results also show that faulty DSB resection leaves unrepaired DNA in the cell, and that disorders of the resection mechanism decrease the viability of the cell population. Our results confirm the thesis that the ATR module is activated after ionizing radiation, with the resection mechanism responsible for this activation. We show that DSB resection increases the effectiveness of repair and lowers the apoptotic fraction of irradiated cells.
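The stochastic half of such a hybrid scheme, the direct Gillespie method, can be illustrated on a minimal birth-death process; the rate constants below are hypothetical and unrelated to the actual ATM/ATR model:

```python
import numpy as np

def gillespie_birth_death(k_prod, k_deg, x0, t_max, rng):
    """Direct Gillespie simulation of a single species with constant
    production (rate k_prod) and first-order degradation (rate k_deg * x)."""
    t, x = 0.0, x0
    ts, xs = [t], [x]
    while t < t_max:
        a = np.array([k_prod, k_deg * x])   # reaction propensities
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)      # waiting time to next reaction
        r = rng.uniform(0, a0)              # choose which reaction fires
        x += 1 if r < a[0] else -1
        ts.append(t)
        xs.append(x)
    return np.array(ts), np.array(xs)

rng = np.random.default_rng(1)
ts, xs = gillespie_birth_death(10.0, 1.0, 0, 20.0, rng)
```

In a Haseltine-Rawlings setting only the slow reactions (e.g. individual break-creation and repair events) would be simulated this way, while fast reactions evolve deterministically between stochastic events.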

Fractal analysis of posturographic time series

Popek M.1, Łątka M.1

1Faculty of Fundamental Problems of Technology, Wrocław University of Technology, Poland

Posturography is a diagnostic method of choice in many balance disorders. The path traced by the center of pressure (COP) is frequently treated as a random walk and analysed using one of many fractal algorithms, such as stabilogram diffusion analysis (SDA) or detrended fluctuation analysis (DFA). These methods yield one, two or even three scaling exponents, which serve as measures of the integrity of the balance control system. Surprisingly, the applicability of fractal methods has not been verified. For a cohort of 21 healthy young adults (11 females, mean age 22 ± 2 years, range 20 to 26 years), we performed a 2 min static posturographic test in the eyes-open condition. We used a Nintendo Balance Board connected to a PC via Bluetooth to monitor COP changes; data acquisition software was written in Java and fractal data analysis was performed in MATLAB. We determined the short-term (STSE) and long-term (LTSE) scaling exponents of anteroposterior posturographic time series using three methods: SDA, DFA (linear detrending) and the recently proposed balanced estimator of diffusion entropy (BEDE). The group-averaged values of STSE were 0.75 ± 0.05, 1.0 ± 0.1, and 0.73 ± 0.06, respectively; the values of LTSE were 0.19 ± 0.08, 0.26 ± 0.07, and 0.20 ± 0.08. For all three methods, we determined the crossover point between the scaling regions: 0.5 ± 0.1 s, 1.8 ± 0.4 s, and 0.6 ± 0.1 s. STSE of posturographic time series is an effective predictor of fall risk. We demonstrated that the DFA estimate of STSE is significantly different (ANOVA, p<0.001) from those of the fractal methods which do not rely on the decomposition of the time series into two independent components: a polynomial trend and a signal.
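As an illustration of one of the methods compared above, DFA with linear detrending can be sketched in a few lines (written in Python rather than the MATLAB used in the study; the window sizes are arbitrary):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis with linear detrending.
    Returns the fluctuation function F(n) for each window size n,
    and the scaling exponent alpha (slope of log F vs log n)."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        msq = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return np.array(F), alpha
```

For uncorrelated white noise this estimator yields alpha near 0.5, while persistent signals give larger exponents, which is how the STSE/LTSE values above are interpreted.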

Designing a computer-based application for subjective tinnitus evaluation

Kostek B.1, Poremski T.2

1Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Poland

2GEERS Akustyka Słuchu Sp. z o.o., Poland

Measuring tinnitus is an important part of the process of helping patients. A standard procedure employs an audiometer to determine both tinnitus pitch and its severity. However, patients are rarely able to identify these two factors correctly, so the process is not accurate. The objective of this paper is therefore to present a computer-based application for diagnosing subjective tinnitus and to check whether this tool shortens the process and makes it more reliable. A touch-screen graphical interface is proposed which patients use while measuring their own tinnitus. The measurement method is based on sound synthesis, and it is compared to the standard method employing an audiometer. For the purpose of this study a group of patients was examined with the designed application and with audiometric tinnitus measurements. The Wilcoxon test was then used to check whether there are statistically significant differences between the two methods. The study shows that patients using the synthesizer are able to estimate their tinnitus twice as fast as with the audiometer, and the evaluation of the tinnitus is more accurate. The Wilcoxon-based analysis shows that the results obtained with the designed application and with an audiometer are statistically different. The authors demonstrate that when patients measure their own tinnitus, they do so in a shorter time than with the standard approach; furthermore, they find the process more user-friendly, and the results are better correlated with their own tinnitus. Overall, patients benefit more from the designed touch-screen application.
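A paired comparison of this kind can be illustrated with `scipy.stats.wilcoxon`; the measurement times below are made-up numbers for eight hypothetical patients, not the study's data:

```python
import numpy as np
from scipy.stats import wilcoxon

# hypothetical paired measurement times (minutes) per patient
audiometer = np.array([12.1, 10.5, 14.2, 11.8, 13.0, 12.7, 15.1, 10.9])
synth_app  = np.array([ 6.0,  5.2,  7.1,  5.9,  6.4,  6.9,  7.5,  5.3])

# Wilcoxon signed-rank test on the paired differences
stat, p = wilcoxon(audiometer, synth_app)
```

Because every synthesized-app time is shorter here, the signed-rank statistic is 0 and the exact two-sided p-value falls below 0.01, i.e. the two methods differ significantly in this toy data set.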

Simulation and visualisation of flocking behaviour and SIR epidemic model

Zoń N.1

1School of Informatics, University of Edinburgh, United Kingdom

Biologically inspired engineering is a branch of science that applies patterns observed in nature to problems from other fields. Two interesting examples of bio-inspired patterns are flocking behaviour and the SIR epidemic model. Flocking [1] is a behaviour that can be manifested by groups of animals moving together, without the need for global leadership. It is a complex behaviour, incorporating varying amounts of three forces: cohesion, separation and alignment. The SIR epidemic model [2] describes how an infectious disease spreads through a group of individuals, taking into account parameters such as the disease's lethality and the contagiousness of an infected individual. When combined, flocking and the SIR epidemic model represent how a disease might spread through a large group of animals. This paper presents the first simulation of the model using libGDX, a Java-based framework for game development [3,4]. It includes a graphical representation of the population, along with a menu which enables the user to adjust the parameters of the simulation, allowing for a close examination of the system.

[1] Fernandez-Marquez, J.L., Di Marzo Serugendo, G., Montagna, S., Viroli, M. and Arcos, J.L., Description and composition of bio-inspired design patterns: a complete overview, Nat Comput (2013) 12:43-67.

[2] Allman, E. S., and Rhodes J. A., Mathematical Models In Biology. Cambridge, UK: Cambridge University Press, 2004. 280-284.

[3] Saltares Marquez, D., and Sanchez, A. C., Libgdx Cross-Platform Game Development Cookbook. Birmingham: Packt Publishing, 2014.

[4] Nair, S. B., and Oehlke, A., Learning Libgdx Game Development. Birmingham: Packt Publishing, 2015.
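The SIR dynamics of [2] can be sketched independently of the libGDX implementation as a simple forward-Euler integration; the parameter values below are hypothetical:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the SIR model:
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    new_inf = beta * s * i * dt   # susceptibles becoming infected
    new_rec = gamma * i * dt      # infected becoming recovered/removed
    return s - new_inf, i + new_inf - new_rec, r + new_rec

# population normalised to 1; hypothetical infection and recovery rates
s, i, r = 0.99, 0.01, 0.0
for _ in range(5000):             # integrate to t = 500 with dt = 0.1
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1, dt=0.1)
```

With basic reproduction number beta/gamma = 5, the epidemic runs its course and most of the population ends up in the recovered compartment; the agent-based simulation in the paper couples these transitions to spatial contact between flocking individuals instead of assuming uniform mixing.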

Application of multichannel EMG signal analysis in estimation of motor unit discharge patterns

Mazurkiewicz P.1

1Institute of Control and Information Engineering, Poznan University of Technology, Poland

Analysis of multichannel EMG signals acquired around the innervation zone (IZ) allows observation of the propagation of motor unit action potentials (MUAPs) of active motor units (MUs) from the IZ to the tendons. These observations allow estimation of the positions of the neuromuscular junctions (NMJs) of the active MUs. Information about the NMJ positions, the MUAP shapes and the time delays between MUAPs on each channel can be used to estimate the discharge patterns of the active MUs. In this paper, three discharge-pattern estimation methods are compared. Two groups of signals were used: the first consists of synthetic multichannel signals; for the second, real signals were recorded from m. biceps brachii with a 16-channel EMG amplifier in 5 patients. From the recordings, signal samples with observable MUAP propagation from the NMJs were selected. Signals from both groups were used to estimate the discharge patterns of particular MUs with the three methods, and quality factors were calculated for every estimation. The results indicate that the estimation method based on V-shaped motor unit action potentials gives the best results, but at higher computational cost. In the analysis of real signals, the compared methods were very sensitive to signal quality. The obtained results indicate that EMG signals recorded above the NMJs, together with the proposed analysis methods, allow the discharge patterns of active MUs to be estimated with some probability. These methods need further development to attain better efficiency, robustness to signal quality and lower computational complexity.
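One common way to obtain the inter-channel time delays mentioned above is the peak of the cross-correlation between channels; the sketch below uses a synthetic Gaussian-shaped MUAP and is an illustration, not the paper's own estimator:

```python
import numpy as np

def delay_samples(sig_a, sig_b):
    """Estimate the delay of sig_b relative to sig_a (in samples)
    from the location of the peak of their full cross-correlation."""
    xc = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(xc)) - (len(sig_a) - 1)

# hypothetical MUAP template propagating with a 5-sample delay
t = np.arange(100)
muap = np.exp(-0.5 * ((t - 30) / 4.0) ** 2)   # Gaussian-shaped potential
delayed = np.roll(muap, 5)                    # same shape, shifted 5 samples
```

Dividing the known inter-electrode distance by the estimated delay (converted to seconds via the sampling rate) then yields the conduction velocity along the fibre.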

Learning from “Mother Nature” in material processing on bio molecular templates

Wójcik M.1

1Department of Biomaterials, AGH University of Science and Technology, Poland

Observations of ‘Mother Nature’ show that the best smart materials have been produced by biological macromolecules, using the transport properties of cell membranes that link the chemical, mechanical, electrical and optical domains and yield biological structures at several levels of mineralization. Theoretical analysis of the problem indicates that the Lorentz force is the driving force for the construction of biological structures resulting from thermal hydrolysis reactions, including compounds of silicon, calcium and phosphorus. The discovery of this mathematically defined mechanism offers opportunities for steering kinetic processes to produce structures with controlled pore-size distribution parameters. This can be used to reproduce many other structures, for example the gradient hydroxyapatite structures of bone or mineralized human tissue, with biomedical use for the controlled release of drugs during therapeutic or surgical treatment. In the article, following the multilevel structures of diatoms, a composite model is presented, designed on the basis of a conductive-polymer matrix filled with a polymeric or ceramic material with magnetic, or magnetic and conductive, properties. Programming such structures and 3D printing them gives a real possibility of realizing structures closer to the best templates of ‘Mother Nature’. Proposed outcomes of the presented investigations include composite materials for biomedical targets in the electrical stimulation of cells, new electrodes for recording and stimulating nerve cells in the study of epilepsy, materials for the reconstruction of nerves or muscles, and bone-implant materials filled with drugs.

Individual Feature Selection for Protein Fold Recognition

Chmielnicki W.1

1Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University, Poland

Protein fold recognition is a key problem in molecular biology. It is a multi-class problem, and one approach to such a task is to decompose it into binary problems and then use binary classifiers. Usually the same feature vectors are used by all these classifiers; however, the binary problems generated by this scheme are very different, so it seems reasonable to use individual feature selection for each of them. In this study the error-correcting output codes (ECOC) decomposition technique has been used: a binary code is generated for each class, and these codes form a code matrix whose columns define the binary problems to be solved. For each problem an individual feature set has been selected. In the experiments, data sets derived from the SCOP database and the feature vectors developed by Ding and Dubchak are used. A support vector machine classifier has been used to test the solution with several different ECOC-based approaches. The recognition ratios obtained using the proposed strategy (59.6%-63.6%) are better than those obtained using the same feature vectors for all binary classifiers (58.5%-62.4%), no matter which approach is used. The improvement is also seen when comparing the average results of the binary classifiers: the average recognition ratio for all binary classifiers using the new approach is 80.9%, compared to 78.6% with the standard approach. The proposed method improves the recognition ratio of the final multi-class classifier thanks to the individual approach to each binary problem. This result can also be used to identify which features are significant for a specific classification problem.
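The idea of per-problem feature selection in an ECOC scheme can be sketched with scikit-learn; the iris data set stands in for the SCOP-derived data and Ding-Dubchak features used in the study, and the parameter choices are illustrative:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.multiclass import OutputCodeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Each column of the ECOC code matrix trains its own copy of this pipeline,
# so SelectKBest picks features independently for every binary sub-problem.
clf = OutputCodeClassifier(
    make_pipeline(SelectKBest(f_classif, k=2), SVC(kernel="linear")),
    code_size=2.0,
    random_state=0,
)
clf.fit(X, y)
train_acc = clf.score(X, y)
```

Wrapping the feature selector and the SVM in one pipeline is what makes the selection individual: the filter is refit on the relabelled (binary) targets of each code-matrix column, mirroring the strategy proposed in the abstract.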

Comparative modelling of the second member of the sodium-dependent glucose cotransporter family (hSGLT2) using the Modeller program

Katolik P.1, Kraszewski S.1

1Faculty of Fundamental Problems of Technology, Wrocław University of Technology, Poland

It is believed that hSGLT2 is responsible for the majority (>80%) of renal glucose reabsorption; when these symporters malfunction, glucose is excreted in the urine, which leads to deviations of glucose levels in the human body. The pharmaceutical industry has already targeted these proteins with inhibitors in order to control the blood glucose concentration of patients with type II diabetes. However, despite promising clinical data, little is still known about the effectiveness of these drugs at the atomic scale. Modeller is a program which takes a user-provided target amino-acid sequence and template structures and, through comparative modelling, constructs a three-dimensional model of the desired protein. Although many difficulties were encountered and many models and procedures were considered, we adapted this advanced homology modelling program to our needs. Here we report the construction of a functional molecular structure of hSGLT2. The structure is fully built, with no gaps or missing parts. It was constructed on the basis of the related crystallized structure of bacterial vSGLT, with certain further adaptations. The model was successfully tested through the deviation of the RMSD function during an MD simulation in its native environment. We were able to identify the glucose and Na binding sites in relation to those identified in vSGLT, as both share the same 1:1 stoichiometry. We consider the constructed model suitable for further analysis and for the development of existing and potentially new inhibitors, using advanced MD simulations, as a treatment for type II diabetes and other diseases related to the SGLT group.
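The RMSD criterion used to test the model's stability during MD can be written out explicitly; the coordinate arrays below are a toy example, and the structures are assumed to be already superimposed:

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two (N, 3) coordinate arrays,
    assuming the structures have already been superimposed."""
    diff = coords_a - coords_b
    return np.sqrt((diff ** 2).sum() / len(coords_a))

# toy structure of 4 atoms, with every atom shifted by 1 Angstrom in x
a = np.zeros((4, 3))
b = np.zeros((4, 3))
b[:, 0] = 1.0
```

In practice, RMSD is computed between each MD frame and the starting model; a trajectory whose RMSD plateaus at a low value (as reported for the hSGLT2 model) indicates a structurally stable model in its native environment.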

The change of daily protein intake depending on the type of diet, with the use of the computer program DietaPro

Kwiatkowski J.1, Ostachowska-Gąsior A.1

1Department of Hygiene and Dietetics, Jagiellonian University Medical College, Poland

DietaPro is a computer program designed to meet the energy and nutrient demands in any disease in which a specific diet is essential for health status, and also to recognize dietary requirements and eliminate dietary mistakes in healthy children and adults. DietaPro can be useful when it is necessary to quickly change diet parameters and prepare a menu tailored to specific nutrient requirements in different health problems. Another function of the program is the possibility of controlling the protein content (per kg of body weight) in the daily diet. The aim of the presented analysis is to show the use of this function of DietaPro in the nutrition of a person with chronic kidney disease, on a chosen example. DietaPro was used to create a diet for a 61-year-old man (body mass 82 kg; height 174 cm; sedentary lifestyle) with diagnosed stage IV kidney failure (GFR = 23). In this particular example, the task for DietaPro was to indicate and choose food products poor in protein (recommended 0.6 g/kg of body weight/day), saturated fatty acids (SFA < 7% of daily energy), cholesterol (max. 200 mg/day) and simple sugars (< 10% of daily energy), with a proper amount of dietary fiber (20-30 g/day) and some minerals (phosphorus max. 700 mg, potassium max. 2700 mg, sodium max. 1500 mg), to meet the specific requirements of people with kidney failure. DietaPro was created on the basis of Microsoft Visual FoxPro 9.0 with the use of MSDN (Microsoft Developer Network), enabling access to all tools, server and client systems, and does not require high-class computer hardware. The program may be used on every version of the Windows platform (from Win95 to Windows 8.x). The program with its database takes about 50 MB and needs a minimal amount of operating memory (from 256 MB). Reports and e-mail messages containing the menu and the results of its analysis may be created in PDF directly from the program.
The analysis of the daily diet and nutritional habits of the presented man, and their comparison with the values recommended in such advanced kidney disease, revealed many nutritional mistakes: too high an intake of protein (1.1 g/kg body weight), SFA, table sugar, cholesterol and phosphorus (more than twice the limit in the case of phosphorus). By applying DietaPro, all dietary mistakes were eliminated from the man's diet, and a special menu adjusted to his health condition and organism efficiency was created. DietaPro makes it possible to detect nutritional mistakes and to create a healthy diet very quickly for every person in each situation, including urgent situations in which the habitual diet needs to be replaced by a special diet that becomes an essential factor for the quality of life. It should also be pointed out that the use of DietaPro in practice may help in the accurate analysis of the nutritional demands of different patients and make collaboration between physician and dietician easier.
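The per-kilogram protein control reduces to simple arithmetic, sketched here for the patient described above (the helper function is illustrative, not part of DietaPro):

```python
def daily_protein(body_mass_kg, grams_per_kg):
    """Daily protein amount (g) for a given body mass and g/kg target."""
    return body_mass_kg * grams_per_kg

# the 82 kg patient from the abstract
allowed = daily_protein(82, 0.6)   # recommended for stage IV CKD, about 49.2 g/day
actual = daily_protein(82, 1.1)    # observed intake, about 90.2 g/day
excess = actual - allowed          # roughly 41 g/day over the recommendation
```

This is the comparison the program performs when flagging the protein intake as excessive and selecting low-protein products for the corrected menu.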

Drug delivery system created from carbon nanotubes and supramolecular compounds

Jagusiak A.1

1Department of Medical Biochemistry, Jagiellonian University Medical College, Poland

There is a great need for research on drug delivery systems (DDS). Research on DDS that lower the side effects of chemotherapeutic agents is conducted by a number of teams. The presented system is composed of carbon nanotubes (CNT) and Congo red (CR). CNT have a large surface area for binding molecules with aromatic structure, and CR molecules self-associate into supramolecular ribbon-like structures that provide the ability to bind drugs; the combination of both systems is a promising solution. The model drug used in this study is doxorubicin (DOX). The aim of this study was the physicochemical characterization of the CNT-CR system and determination of the possibility of using CNT-CR as a drug carrier. A method for dispersing CNT via CR and binding DOX to the resulting system was developed. To investigate the structure of the CNT-CR complex, molecular modelling and microscopic methods (AFM, TEM, SEM) were used, together with DLS and spectrophotometric analysis. The stability of the complex was evaluated by DSC, and drug release was examined at different pH. The results showed the presence of CR interactions with the surface of the CNT. Quantitative analysis of CR binding indicates a practically unlimited capacity of CR to bind to the surface of the nanotubes, which clearly increases the binding capacity for doxorubicin-type drugs. The greatest release of DOX was observed at lowered pH = 5. A new high-capacity system was constructed, giving the possibility of binding multiple drugs simultaneously. It binds to the cell surface and provides controlled drug release at the low pH present in endosomes, tumour tissue and the bladder. These features may provide sustained local action.

The effect of beclomethasone aerosol deposition location on the function of selected brain structures

Podolec Z.1, Panek D.2, Hartel M.3, Banasik T.3, Podolec-Rubiś M. 1

1CBR MEDiNET, Kraków, Poland

2Department of Measurement and Electronics, AGH University of Science and Technology, Kraków, Poland


This is the first effort to use functional magnetic resonance imaging (fMRI) to study the time course and locus of beclomethasone aerosol effects on selected brain structures in humans treated with steroids for asthma. Functional scans were acquired 7 times at 5 s intervals over 45 min, with beclomethasone and placebo doses administered at the 5 min mark. Beclomethasone was supplied at doses of 250, 500 and 1000 mg, and as placebo; these were administered with an integrated spirometer-inhaler in order to favour aerosol deposition in the central airways in asthma (CAD). Structural image segmentation was done using the subcortical Harvard-Oxford Structural Probability Atlas distributed with the FSL neuroimaging analysis software, transformed into each subject's space, and a time series was extracted. To minimize momentary signal fluctuations, the data were averaged in 5 min intervals, and the change in signal strength was registered in standard-deviation units for each succeeding 5 min segment: BOLD (blood-oxygen-level-dependent) measurements for placebo were pooled into a common distribution with the mean set to zero and the SD set to 1. After administration of beclomethasone in CAD at doses of 500 and 1000 mg there is a significant decrease in the BOLD signal in the brainstem compared with administration of placebo and of 250 mg beclomethasone, followed by an increase in the BOLD signal. The fMRI examination enables evaluation of the impact of steroids on selected brain structures. Assessment of the impact of steroids on the brain may be important for understanding the role of the brain in the pathophysiology, diagnosis and treatment of asthma.
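The normalisation described above (averaging in 5 min bins, then expressing each segment in SD units of the placebo distribution) amounts to the following; the function names and the toy arrays are illustrative, not taken from the study's pipeline:

```python
import numpy as np

def bin_average(ts, bin_len):
    """Average a 1-D time series in consecutive bins of bin_len samples,
    discarding any incomplete final bin."""
    n = len(ts) // bin_len
    return ts[:n * bin_len].reshape(n, bin_len).mean(axis=1)

def to_placebo_units(signal, placebo):
    """Express a signal in SD units of the placebo distribution
    (placebo mean maps to 0, placebo SD maps to 1)."""
    return (signal - placebo.mean()) / placebo.std()

# toy check: 10 samples averaged in bins of 5, then z-scored to a
# placebo distribution with mean 1 and SD 1
binned = bin_average(np.arange(10.0), 5)
z = to_placebo_units(np.array([3.0]), np.array([0.0, 2.0]))
```

Under this convention, a drug-condition segment at z = -2 lies two placebo standard deviations below the placebo mean, which is how the brainstem BOLD decrease after the 500 and 1000 mg doses is expressed.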

Polygonal modeling of the human cochlea from micro-CT data

Skrzat J.1, Kozerska M.1, Musiał A.1

1Department of Anatomy, Jagiellonian University Medical College, Poland

Information on the variability of size and shape of the human cochlea is essential for investigation of cochlear morphology. Modeling of the cochlea shape may provide new insight into its complex anatomy and its function in the auditory system. The aims were to create 3D polygon mesh models of the cochlea from micro-CT data, to compare them to volume-rendering images presenting cochlear anatomy in multiplanar reconstructions, and to evaluate the algorithms used for building computer models of the cochlea and their usefulness in anatomical research. The petrous parts of ten human temporal bones were scanned using the Skyscan 1172 microtomograph. CT scans were processed with the CTvox and CTAn software to perform volume and surface reconstruction of the examined samples. Creation of the polygonal meshes of the cochlea and its modeling were performed by means of the MeshLab software. Computer-aided geometric design of the cochlea was presented in a series of figures showing its external and internal morphology. The generated polygonal meshes delivered models presenting the surface topography of the cochlea. Application of shading and smoothing algorithms proved beneficial for better visual perception of the cochlea. The created models of the inner ear revealed the location of the cochlea inside the petrous bone and its relationship to neighboring structures. To conclude: the morphology of the cochlea and its geometrical design can be investigated from a stack of micro-CT scans composed into 3D models. Mesh models appeared more convenient for studying morphometrical features of the cochlea than volume-rendering images. 3D models may be helpful in surgical procedures and used in modeling the anatomy of the human ear.

Gradient-based parameter estimation of cell signalling pathway models based on normalized semi-quantitative data

Łakomiec K.1, Fujarewicz K.1, Śmieja J.1

1Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Poland

Numerical parameter estimation is a very important and nontrivial task in the system identification process, because there is no 'universal' method that can be applied to all types of identified models. Especially in systems biology, where models have many parameters and a small number of experimental measurements, this task is really challenging. Usually, parameter estimation of mathematical models of cellular pathways is based on minimizing a given weighted quadratic objective function. This type of objective function takes into account the output of the model and the corresponding experimental data collected at successive time points. The semi-quantitative data usually come from so-called blots, which after quantification may be compared only within one blot but not between two different blots. In our previous works we performed gradient-based estimation not only of model parameters but also of additional scale factors, one factor per blot. Such an approach made the estimation problem feasible but considerably increased its computational complexity. In the present work we propose another approach to parameter estimation based on semi-quantitative data that does not require estimation of scale factors. We normalize both the experimental data and the output of the model. The experimental data are normalized once, before the parameter estimation procedure; the output of the model is re-normalized in every iteration of the gradient-based parameter estimation. The numerical results obtained for a mathematical model of the DNA repair process show the advantages of the new method.
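A minimal numpy sketch of such a scale-free objective follows. The abstract does not fix the normalization; unit Euclidean norm is one plausible choice, and the one-blot exponential-decay "pathway" model below is purely hypothetical.

```python
import numpy as np

def normalize(y):
    """Scale a series to unit Euclidean norm, removing the unknown
    per-blot scale factor."""
    y = np.asarray(y, dtype=float)
    return y / np.linalg.norm(y)

def objective(params, model, data_norm, t, weights):
    """Weighted quadratic misfit between the model output,
    re-normalized at every evaluation, and the pre-normalized data."""
    residual = normalize(model(params, t)) - data_norm
    return float(np.sum(weights * residual ** 2))

def decay_model(params, t):
    """Hypothetical pathway output: amplitude a, decay rate k."""
    a, k = params
    return a * np.exp(-k * t)

t = np.linspace(0.0, 2.0, 20)
data = decay_model((2.0, 1.5), t)   # "blot" with unknown scale
data_norm = normalize(data)          # normalized once, up front
w = np.ones_like(t)
```

Because both sides are normalized, the objective is invariant to the blot's scale factor, so no extra scale parameters need to be estimated.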

Acknowledgement: This work was funded by Polish National Science Centre under grant DEC-2013/11/B/ST7/01713.

Some remarks on support vector machines application

Pieter J.1, Student S.1, Fujarewicz K.1

1Silesian University of Technology, Poland

The support vector machine (SVM) technique is a well-established method of supervised data analysis, widely used in many scientific areas including biology and medicine because of its robustness and effectiveness. Support vector machines, when compared to other methods of supervised data analysis, frequently appear the best choice. Practical application of an SVM requires the user to specify several options/parameters, for example the type of kernel and the regularization parameter C (the penalty factor in the minimized quadratic objective function). In general, this should be done by validation on an independent data set or by cross-validation. A very common practice is to specify these parameters based on the user's experience or by using the default options of the numerical procedure. In this work we focus on specification of the penalty term C. We have found that for given class distributions of the training set, the optimal value of C depends on the number of samples. Based on this observation we propose a modification of the minimized quadratic objective function. Thanks to this modification the optimal value of C depends only on the type of the data and does not depend on the number of samples, which makes the determination of C much easier for the user.
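The abstract does not spell out the exact modification, but one modification with the stated property is to divide the penalty term by the number of samples, so the hinge-loss term becomes a per-sample average rather than a sum. The sketch below is written under that assumption; the toy data and names are ours.

```python
import numpy as np

def svm_primal_objective(w, b, X, y, C, scale_by_n=True):
    """Linear-SVM primal objective 0.5*||w||^2 + penalty * sum(hinge).
    With scale_by_n=True the penalty is C/n, so the data term becomes a
    per-sample average and the optimal C no longer grows with n."""
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    penalty = C / len(y) if scale_by_n else C
    return 0.5 * float(w @ w) + penalty * float(hinge.sum())

# Tiny illustrative training set and a fixed candidate hyperplane.
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = np.array([0.5, 0.2]), 0.1
```

Duplicating the training set leaves the scaled objective (and hence its minimizer) unchanged, while the unscaled data term doubles - exactly the sample-count dependence the authors observed.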

Acknowledgement: This work was funded by Polish National Centre for Research and Development under grant PBS3/B3/32/2015.

Lateralisation Exercise System for Children Therapy

Gardecki A.1, Kawala-Janik A.1

1Faculty of Electrical Engineering, Automatic Control and Informatics, Opole University of Technology, Poland

This interdisciplinary work briefly presents the basics of a LES (Lateralisation Exercise System). The proposed solution is intended to support therapy of the lateralisation phenomenon. The proposed LES is a program which can be successfully deployed on the very popular and quite cost-effective NAO humanoid robot in order to enable work with children affected by, inter alia, autism. The initial study was carried out at the Opole University of Technology. The NAO humanoid robot was used to interact with a child in order to improve the child's attention and to help them focus on given tasks. The robot also enhances the child's willingness to perform and repeat exercises. At this stage (due to ethical aspects) tests were, for the time being, carried out on healthy participants only. A typical LES system integrates various components for information exchange in human-robot systems. Implementation of the NAO humanoid robot plays a significant role in the therapy of autistic children. According to the literature, these robots are both inexpensive and easy to use. The initial study results were promising. A typical modular LES system enables relatively easy adjustment to the technical requirements of different devices and to the therapist's strategy. A control desk with a panel allows the therapist to intervene in the execution of an exercise and to control its proper course.

Molecular study of small cell lung carcinoma promotion by the catestatin neuropeptide

Drabik D.1, Langner M.1, Kraszewski S.1

1Department of Biomedical Engineering, Wroclaw University of Technology, Poland

The catestatin peptide is a novel and potent inhibitor of catecholamine release from chromaffin cells and adrenergic neurons. Herrero et al. suggested that catestatin might regulate an autocrine process in neuroendocrine secretion through binding with different nicotinic acetylcholine receptor (nAChR) subtypes (Herrero, Ales et al. 2002). The studies of Sciamanna et al. show that nAChRs may influence the secretion of autocrine growth factors, thus being involved in the growth and metastasis of malignant neuroendocrine neoplasms through synergizing with the activity of voltage-sensitive Ca2+ channels (Sciamanna, Griesmann et al. 1997). Since neurons in the CNS and PNS can express more than one type of nAChR (Lindstrom et al., 1995; Ullian and Sargent, 1995), it is important to discover the molecular roles of catestatin not only on the neuronal α7 nAChR but also on its muscle type (Hughes, Kusner et al. 2006). Automated docking and molecular dynamics studies were performed on the muscle-type nAChR membrane protein (PDB ID: 2BG9) embedded in a fully hydrated lipid bilayer. Computer simulations showed that catestatin docks at the proximal vestibule of the receptor pore, confirming previous findings. The other outcome is the identification of a new high-affinity binding site on the β subunit near the membrane surface. These findings indicate potentially different catestatin modes of action, and underline the importance of the lipid bilayer in peptide folding before alternative binding. The characterization of association kinetics and identification of specific amino acids may assist the design of catestatin-mimetic synthesis strategies to curb small cell lung carcinoma growth related to muscle-type nAChRs.

Implementation of Fractional Calculus-Based Methods for the Purpose of Analysis of EEG Signals

Kawala-Janik A.1, Baranowski J.2, Bauer W.2, Podpora M.1, Schneider P.1

1Institute of Electromechanical Systems and Industrial Electronics, Opole University of Technology, Poland

2Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

This paper presents a comparison of signal processing methods based on the implementation of fractional calculus for the analysis of EEG signals. The signals were gathered from various sources, such as an inexpensive EEG headset, open-source test data and a professional electroencephalograph. The paper presents the efficiency of the proposed method for each kind of data. The applied signal processing methods were based on fractional-order calculus and are both quick and efficient. Some basic fractional filters were applied. The experiments at this stage were carried out in an off-line mode. MATLAB software was used for the computations. The results of the study were promising; however, the highest efficiency was achieved for the open-source test data, probably because that data was recorded with the best-quality equipment. This is only the initial study phase, and therefore only a small amount of data was examined. Further tests on other biomedical data are planned.

Perspectives of using Markov Chain Monte Carlo for simulation of Primary Health Care patients population behaviour

Bauer W.1, Baranowski J.1, Mitkowski W.1, Tomasik T.2

1Department of Automatics and Biomedical Engineering, AGH University of Science and Technology, Poland

2Department of Family Medicine, Jagiellonian University Medical College, Poland

The purpose of this paper is to present the concept of simulating the behaviour of Primary Health Care (PHC) patients. There are two reasons for interest in this area. The first is the need for support in decision making regarding the financing and management of PHC facilities, on levels from national (National Health Fund) to local (PHC provider). The second is the lack of effective methods for modeling such populations. The authors describe an idea of probability distribution modeling and present preliminary results of a simulation of patient visits over a period of a year. In statistics, Markov Chain Monte Carlo (MCMC) methods are a class of algorithms for sampling from a probability distribution. Such a probability distribution can be obtained, for example, by the use of Bayesian data analysis. The main goal of this study is to investigate the feasibility of a probabilistic model that will best explain the dependencies in the data. It starts from the known information (a priori information, the prior) obtained from experts. Then an a posteriori probability density function (the posterior) is created from the available data. The posterior is used to simulate the population behaviour. This approach guarantees that the solutions are influenced both by the measured data and by the knowledge available beforehand. To illustrate how general MCMC methods work, a Metropolis-Hastings-type algorithm has been implemented. The presented methodology for simulation of patient behaviour in primary health care is promising, and further investigation of this approach is warranted. The primary areas of research will be, among others, numerical efficiency, efficient ways of creating probability density models, and objective methods of creating priors from expert knowledge.
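A random-walk Metropolis-Hastings sampler of the kind mentioned above fits in a dozen lines. The Gaussian target below is purely illustrative (a stand-in for a posterior over, say, a visit-rate parameter), not the model used by the authors.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Sample from a distribution known only up to a normalizing
    constant, using a symmetric Gaussian random-walk proposal."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    chain = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + rng.normal(0.0, step)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, post(proposal)/post(x)).
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        chain[i] = x
    return chain

# Illustrative target: unnormalized log-density of N(2, 1).
samples = metropolis_hastings(lambda x: -0.5 * (x - 2.0) ** 2,
                              x0=0.0, n_samples=20000)
```

After discarding a burn-in prefix, the empirical moments of the chain approximate those of the target, which is what makes posterior-driven population simulation possible.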

Neural model of operator cooperation with a parallel driving system - a set of input data

Trzyniec K.1

1Faculty of Production and Power Engineering, University of Agriculture in Kraków, Poland

Currently, the human mental load of the information age seems to be one of the most important topics [1]. The determination of a measure of mental effort for a given work position is therefore very important, for ergonomic reasons (recommendations for corrective and conceptual ergonomics) and for organizational ones (e.g. in order to rationally plan breaks and substitutions at work). Unfortunately, no methodology for assessing the degree of mental workload has yet been developed that would allow clear, objective and quantitative results to be obtained. Since the operation of an artificial neural network can be assessed by two indicators, learning time and the number of errors it commits, and since the number of errors committed by the operator and the time needed to learn operational activities may likewise be indicators of the operator's information load and the difficulty of training, the hypothesis was put forward that an artificial neural network can model the transmission of information in the man-machine system. The input data for this network are the codes of messages from signalling devices, and the output data their interpretation (understanding). The aim of the presentation is to present the set of input data for the neural model of operator cooperation with the parallel driving system used in precision agriculture. The collected data are in graphic form and show the differences between the driving track determined using GPS on the navigation screen and the trajectory of the vehicle driven by the operator.

[1] Juliszewski T., Ogińska H., Złowodzki M. 2011. Ergonomia wobec obciążeń praca umysłową. W: Obciążenie psychiczne pracą – nowe wyzwania dla ergonomii. Komitet ergonomii PAN. Kraków, 11.

Rapid prototyping in applications to preoperative planning

Dudek P.1, Cieślik J.1

1Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Poland

This work aims to describe computer-aided design and rapid prototyping systems for preoperative planning and the fabrication of custom-made medical aids. Advanced manufacturing technologies are used nowadays in the biomedical industry, especially in the area of diseases that affect the joints of the human body, which are most often replaced by artificial joints. These technologies can also be used for preoperative planning of complicated or simple operations, shortening their duration. 3D models of the patient's body are generated based on computed tomography image data. After evaluation, the 3D reconstructed image was used to manufacture a suitable model from plastic powder using Selective Laser Sintering technology. This model can be used to plan the operation: how to use the surgical tools, the place and direction of cuts, and the position and shape of the implant. This technology can also help with soft-tissue surgery, for example by allowing the shape of the heart valves needed for a transplant to be determined, or by showing how to conduct the operation - before the operation the surgeon can see the organ as a model in its actual size and shape.

Information theory analysis of human gait

Kozłowska K.1, Łątka M.1

1Department of Biomedical Engineering, Wroclaw University of Technology, Poland

The control mechanisms involved in the planning of gait are poorly understood. For example, the intended gait speed is determined by the interplay of stride length and cadence. Recent studies suggest that in most subjects walking in an unchallenging environment there exists a linear stride length-cadence relationship. To elucidate the selection of gait parameters, we employ transfer entropy to investigate the flow of information between the time series of step length and step duration. Methods: 10 healthy students (5 females, age 22 years) walked 400 m on a treadmill at three speeds (4, 5 and 6 km/h). Optical tracking of markers attached to the subjects' ankles was used to determine the step length and duration time series. Image processing of the gait video (200 fps, 720p) was done using the C++ OpenCV library. The Java Information Dynamics Toolkit (JIDT) was used to calculate the transfer entropy T (box kernel with r=0.3) as a function of lag n for the information flow between the time series of gait parameters. For 4 km/h, the transfer entropy for the information flow from stride duration to stride length, T(SD->SL), was equal to 0.48±0.17 bits (n=1), 0.69±0.19 bits (n=2), 0.64±0.14 bits (n=3) and 0.30±0.09 bits (n=4). For n>4 the transfer entropy was close to zero. The flow of information for the higher treadmill speeds (5 and 6 km/h) was not statistically different. The values of T(SL->SD) were: 0.58±0.32 bits (n=1), 0.70±0.24 bits (n=2), 0.62±0.17 bits (n=3) and 0.42±0.19 bits (n=4). To our knowledge, this is the first application of information theory to human gait. We find that information is exchanged almost symmetrically between step length and duration. The transfer entropy has a distinct peak at n=2, which corresponds to a gait cycle.
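The study used JIDT's box-kernel estimator; for intuition, a simplified plug-in (histogram) estimate of lag-1 transfer entropy can be written directly from the definition T(X->Y) = sum over (y_{t+1}, y_t, x_t) of p(y_{t+1}, y_t, x_t) log2[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ]. This is a sketch for illustration, not a replacement for the kernel estimator.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate (in bits) of lag-1 transfer entropy T(X->Y):
    the reduction in uncertainty about y[t+1] from knowing x[t],
    beyond what y[t] already provides."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Discretize each series into equal-width bins.
    xd = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    n = len(x) - 1
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    singles = Counter(yd[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / pairs_yx[(y0, x0)]            # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += (c / n) * np.log2(p_cond_full / p_cond_hist)
    return te
```

When the source series fully determines the next value of the target, the estimate approaches the source entropy; for independent series it stays near zero (up to a small positive sampling bias).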

Symmetry of dynamical cerebral autoregulation

Piątek M.1, Łątka M.1

1Department of Biomedical Engineering, Wroclaw University of Technology, Poland

Cerebral autoregulation (CA) is an important neuroprotective mechanism which maintains relatively stable cerebral blood flow despite changes in cerebral perfusion pressure which may be as large as 100 Torr. It is uncertain whether this mechanism is symmetric, that is to say, whether it compensates equally for upward and downward changes in arterial blood pressure (ABP). We used transcranial Doppler ultrasonography (TCD) to measure blood flow velocity in the middle cerebral artery during a sequence of postural changes (sit->stand->crouch->stand) in 12 healthy subjects (7 females, 22 years old). The ABP was concurrently measured by noninvasive photoplethysmography. The CA responses to blood pressure rises (stand->crouch) and drops (crouch->stand) were quantified with the help of the recently formulated AMD model, which yields two hemodynamic parameters: the delay of the response and the autoregulatory strength. The group-averaged values of the delay time (1.8 ± 0.3 s) and the autoregulatory strength (0.20 ± 0.06 1/s) for ABP rises were not statistically different from those for ABP drops. The application of a novel and effective CA metric enabled us to demonstrate that the dynamic response of cerebral autoregulation to pressure changes is symmetrical.

Patient specific orthoses using RP technologies

Dudek P.1

1Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Poland

Orthotic devices are currently designed to fit a range of patients and therefore do not provide individualized comfort and function. Using RP methods it is possible to make individual orthotic devices tailored to the individual patient and condition, also characterized by a modern design. A computerized technique for fabricating patient-specific orthotic devices has the potential to provide excellent comfort and to allow changes in the standard design to meet the specific needs of each patient. Reverse engineering methods are combined with rapid prototyping to create patient-specific orthoses. A novel process was engineered that takes patient-specific surface data of the patient's anatomy as digital input, manipulates the surface data into an optimal form using CAD software (also refining the appearance and checking the strength), and then sends it to a rapid prototyping machine for fabrication.

Computer simulation of heterogeneous melanoma

Kłusek A.1, Dzwinel W.1

1Institute of Computer Science, AGH University of Science and Technology, Poland

Tumor proliferation is a very complex biological process involving many spatio-temporal scales and plenty of biochemical processes. There exists a plethora of computer models which mimic some of them [1]. However, the possibility of integrating all known tumor growth factors in a single computational model remains a dream of the future. Existing models are mainly focused on mesoscopic processes, such as angiogenesis and vascular remodeling, and on microscopic ones involving DNA and molecular scales (e.g. simulation of metabolic paths). Recently, we observe a tendency to regard cancer heterogeneity as one of the main factors of tumor progression [2,3]. Until now, this aspect has not been intensively investigated by means of computer modelling. We present here a model of tumor dynamics which assumes cancer heterogeneity. The model is an extension of the well-known work of Welter and Rieger [4]. In this paper, we present the results of a heterogeneous melanoma simulation, assuming various levels of genetic heterogeneity of both the "driver" and "passenger" types, and showing its influence on tumor growth dynamics. In our model, we assume that the "driver" mutations affect the proliferation and death rates, while the "passenger" ones represent the tumor "microenvironment", which might potentially increase the phenotypic plasticity of cancer cells by deregulating gene expression networks. We demonstrate snapshots from a simulation of melanoma. The model was adapted to GPU boards and implemented in the CUDA environment. It allows for simulating cancers of about 1 cm3 in volume using a standard laptop with a GeForce board.
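The effect of "driver" mutations on growth dynamics can be illustrated with a deliberately minimal, non-spatial logistic sketch. All parameter values below are hypothetical; the actual model extends Welter and Rieger's vascularized framework and runs on CUDA.

```python
import numpy as np

def grow(n_steps, n_drivers, base_birth=0.02, base_death=0.01,
         driver_boost=0.005, capacity=1e9, n0=100.0):
    """Logistic growth of a single subclone whose proliferation rate
    is raised by each accumulated driver mutation; passenger mutations
    are treated as neutral and are not modelled here."""
    rate = base_birth + driver_boost * n_drivers - base_death
    sizes = np.empty(n_steps)
    n = n0
    for i in range(n_steps):
        n += rate * n * (1.0 - n / capacity)   # explicit Euler step
        sizes[i] = n
    return sizes

wild_type = grow(500, n_drivers=0)
driver_clone = grow(500, n_drivers=2)
```

Even a small per-driver boost compounds exponentially: the two-driver clone overtakes the wild type long before either approaches the carrying capacity, which is the basic mechanism by which heterogeneity shapes growth dynamics.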

Acknowledgment: This research is financed by the Polish National Center of Science (NCN) project DEC - 2013/10/M/ST6/00531

[1] Cristini V. and Lowengrub J. Multiscale modeling of cancer: an integrated experimental and mathematical modeling approach. Cambridge University Press, 2010, p. 278.

[2] Smith C. Cancer biology: Cancer shows strength through diversity. Nature 499.7459 (2013): 505-508.

[3] Marusyk A., Almendro V., and Polyak K. Intra-tumour heterogeneity: a looking glass for cancer?. Nature Reviews Cancer 12.5 (2012): 323-334.

[4] Welter, M., and Rieger H. Physical determinants of vascular network remodeling during tumor growth. The European Physical Journal E: Soft Matter and Biological Physics 33.2 (2010): 149-163.

Cascades of consciousness in narrative communication

Grabska-Gradzińska I.1, Kulig A.2, Drożdż S.2, Kwapień J.2, Oświęcimka P.2

1Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University, Poland

2Complex Systems Theory Department, Institute of Nuclear Physics, Polish Academy of Sciences, Poland

Natural language is a field of interest of information science, physics, and the science of complex systems. Literary texts are especially interesting for research, because their structure is not accidental but crafted with great care. Text organisation at different levels is used for the classification and description of literary movements. Nowadays, the cognitive theory of literature in particular combines classical literary tools with the life sciences. We study the distribution characteristics of word and sentence lengths. We transform the text samples into word-adjacency networks. Time series representing sentence lengths are created, and their fractal properties are analysed using two methods: MFDFA and WTMM. We create separate corpora of prose, poetry and scientific texts and calculate a node-degree cumulative distribution P(X ≥ k). In order to study the long-range correlations, particularly those that refer to fractals and cascading effects, we select a corpus of world-famous literary texts of considerable size (over 5000) and form a series l(j) from the lengths of the consecutive sentences j. Studying the characteristics of sentence length shows that an appealing and aesthetic optimum involves a self-similar, cascade-like alternation of sentences of various lengths. As far as correlations in the sentence-length variability are concerned, some texts - within the present corpus, exclusively those belonging to the stream-of-consciousness narrative - develop even more complex scale-free patterns of a nonlinear character. The resulting cascades manifest themselves in a whole spectrum of scaling exponents, compactly grasped by the multifractal spectrum f(α), whose width reflects the degree of complexity involved.
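A minimal sketch of the first steps of this pipeline: extract the sentence-length series l(j) and estimate a single scaling exponent with ordinary (monofractal) DFA. The study itself uses the multifractal extensions MFDFA and WTMM, which generalize the same fluctuation function to varying moments; the code below is illustrative only.

```python
import re
import numpy as np

def sentence_lengths(text):
    """Series l(j): word counts of consecutive sentences."""
    parts = re.split(r"[.!?]+", text)
    return np.array([len(p.split()) for p in parts if p.strip()])

def dfa_exponent(series, scales=(8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(s) vs log s,
    where F(s) is the RMS of the linearly detrended profile taken
    over non-overlapping windows of size s."""
    profile = np.cumsum(series - np.mean(series))
    F = []
    for s in scales:
        ms = []
        for i in range(len(profile) // s):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```

An exponent near 0.5 indicates an uncorrelated sentence-length sequence, while larger values indicate the persistent long-range correlations of the kind reported here for stream-of-consciousness prose.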

Quantum nature consciousness and artificial consciousness

Adamski A.1

1Faculty of Ethnology and Educational Science in Cieszyn, University of Silesia in Katowice, Poland

Apart from using biochemical channels, the human biological system, in order to transfer information, uses electromagnetic, acoustic and soliton waves; electric, electromagnetic and torsion fields; as well as bioplasma. Each of these channels may be a carrier of information for a biological system, or they can function together in a bioplasma system. Biochemistry, under the influence of quantum mechanics, beyond the reference to the presence of electrons, still does not see the role of photons, phonons, solitons, electromagnetic and spin fields, and bioplasma in the life processes of the organism. Biochemistry recognizes only ionic conductivity, which moves at a speed of 120 m/s and is too slow to provide the information needed for the regeneration of biological structures and the functioning of perceptual and mental processes. In the bioelectronic model it is assumed that the structure of proteins, DNA, RNA and melanin is an electronic material in which various local or nonlocal quantum processes can occur. Complex regulatory, biomolecular, cybernetic, informatic and quantum processes therefore occur at the level of a single cell.

Simulation of skinfold creep under the load of caliper within the poroelastic model

Nowak J.1, Kaczmarek M.1

1Institute of Mechanics and Applied Computer Science, Kazimierz Wielki University, Poland

The accumulation of lymph in the tissue of the trunk and arm after breast cancer surgery is caused by a disturbance in lymph transport from the interstitial space through the lymphatic system. This work is based on the idea of Roberts et al. [1] of measuring the creep rate of tissue with edema using a modified Harpenden skinfold caliper in order to distinguish between the normal and affected sides. The aim of our considerations is to aid understanding of the creep phenomenon and of the creep rate parameter as a sensitive indicator of the existence of edema. Simulations are performed within the range of finite deformations. A simplified material characterization and fold geometry, as well as a set of initial and boundary conditions, are formulated. Taking into account the two-phase nature of lymphedematous tissue, a poroelastic model implemented in COMSOL Multiphysics was used. Darcy's law and the continuity equation are assumed for the fluid, while the solid skeleton is described by the mechanical equilibrium equation, the effective stress law, and a hyperelastic (Neo-Hookean) material model. The creep, being the result of pore fluid displacement in the tissue, was simulated by the finite element method. The evolutions of pore pressure, pore fluid velocity and displacements are studied. The creep rate parameter was estimated by comparing the thickness changes of the fold at two time instants (10 s and 60 s). A parametric analysis was performed to indicate the influence of crucial parameters, such as stiffness or permeability, on the tissue's hydromechanics. The results obtained showed that the shape of the creep curve is sensitive to the material properties of the poroelastic model and that the creep measure may be a useful diagnostic tool as an indicator of the existence of edema.

[1] Roberts C. C., Levick J. R., Stanton A.W.B., Mortimer P. S., Assessment of truncal edema following breast cancer treatment using modified harpenden skinfold calipers, Lymphology, 1995, 28:78-88.

Visual speech models for polish speech processing in noisy environments

Jadczyk T.1,2, Ziółko M.1,2

1Department of Electronics, AGH University of Science and Technology, Poland

2Techmo sp. z o.o., Poland

Speech recognition systems are becoming more popular and reliable nowadays; however, noise robustness is still an issue. The use of a multi-modal (audio and visual) system is one of the techniques for overcoming problems with recognition accuracy in low Signal-to-Noise Ratio (SNR) environments. It is based on the bimodal nature of human speech perception. Two types of visual models were analyzed; both require lip-region (ROI) extraction at the beginning. The first model is based on raw pixel intensities from the ROI, transformed by a 2-D Discrete Cosine Transform (DCT model). The second one utilizes the positions of marked points on the lip contours and the pixels inside the lip shape (Active Appearance Model - AAM). Two approaches to the integration of the audio and visual modalities were also tested: feature concatenation with dimensionality reduction (early integration) and model integration. For the second strategy we prepared a set of 12 visemes (visually distinguishable units), compared to the standard 37 phonemes used in audio-only speech recognition. The audio stream is processed with MFCC parametrization. Our system is evaluated on an Audio-Visual Speech Corpus, which contains the most popular phrases from Polish dialogues, uttered by 24 speakers. Experiments under various clean and noisy conditions showed that using both audio and visual modalities in low SNR can reduce the Phrase Recognition Error Rate by from 10% (DCT model, feature concatenation) and 15% (AAM) up to 25-35% (model fusion, DCT and AAM respectively) compared to a standard HMM-based, audio-only system. The experiments showed that enhancing a speech recognition system with the visual modality improves robustness in difficult audio conditions; however, some additional work, like speaker normalization, may be required to obtain the best results. The same setup showed that in clean audio conditions using visual models brings no benefit to the system.

Acknowledgement: This work was supported by Polish National Science Centre (NCN) granted by decision DEC-2011/03/N/ST7/00443

Medical data preprocessing for increased selectivity of diagnosis

Walczak A.1, Paczkowski M.1

1Department of Cybernetics, Military University of Technology, Poland

In this paper, we present a framework which enables increased accuracy of computer diagnosis in medical patient checkups. A new proposition for medical data analysis has been built, based on medical data preprocessing. The result of such preprocessing is the transformation of medical data from a semantic form into a parameterized mathematical form. A model for mining hidden data properties is presented as well. Exploration of hidden data properties creates new possibilities for medical data exploration. Diagnosis selectivity increases in the presented computer diagnostic system via parameterized illness patterns in the medical database. The increase in selectivity is achieved by discovering hidden properties of symptoms in the medical database.


Acknowledgement: This work was funded under grant 02.03.03-00-013/08 of the Innovative Economy Operational Programme (POIG), a research programme of the Polish Ministry of Regional Development.




The journal Bio-Algorithms and Med-Systems provides a forum for the exchange of information in the interdisciplinary fields of computational methods applied in medicine presenting new algorithms and databases that allow the progress in collaboration between medicine, informatics, physics, and biochemistry.