Publications

3D Body Scanning and Healthcare Applications

@article{treleaven20073d,
title={3D body scanning and healthcare applications},
author={Treleaven, Philip and Wells, JCK},
journal={Computer},
volume={40},
number={7},
pages={28–34},
year={2007}
}

Developed largely for the clothing industry, 3D body-surface scanners are transforming our ability to accurately measure and visualize a person’s body size, shape, and skin-surface area. Advancements in 3D whole-body scanning seem to offer even greater potential for healthcare applications.

A Brief History of Financial Risk and Information

@article{flood2012brief,
title={A brief history of financial risk and information},
author={Flood, Mark D},
journal={Handbook of Financial Data and Risk Information},
volume={1},
year={2012}
}

This chapter presents the historical context for the current state of financial information and risk management. In lieu of a comprehensive history, the authors discuss several broad historical themes in risk and finance: institutionalization, technology, globalization, and complexity, including the rise of risk management professionals. Emblematic events are used to illustrate the evolution of the financial markets and risk management.

A Collaboratively-Derived Science-Policy Research Agenda

@article{sutherland2012collaboratively,
title={A collaboratively-derived science-policy research agenda},
author={Sutherland, William J and Bellingan, Laura and Bellingham, Jim R and Blackstock, Jason J and Bloomfield, Robert M and Bravo, Michael and Cadman, Victoria M and Cleevely, David D and Clements, Andy and Cohen, Anthony S and others},
journal={PLoS ONE},
volume={7},
number={3},
pages={e31824},
year={2012},
publisher={Public Library of Science}
}

The need for policy makers to understand science and for scientists to understand policy processes is widely recognised. However, the science-policy relationship is sometimes difficult and occasionally dysfunctional; it is also increasingly visible, because it must deal with contentious issues, or itself becomes a matter of public controversy, or both. We suggest that identifying key unanswered questions on the relationship between science and policy will catalyse and focus research in this field. To identify these questions, a collaborative procedure was employed with 52 participants selected to cover a wide range of experience in both science and policy, including people from government, non-governmental organisations, academia and industry. These participants consulted with colleagues and submitted 239 questions. An initial round of voting was followed by a workshop in which 40 of the most important questions were identified by further discussion and voting. The resulting list includes questions about the effectiveness of science-based decision-making structures; the nature and legitimacy of expertise; the consequences of changes such as increasing transparency; choices among different sources of evidence; the implications of new means of characterising and representing uncertainties; and ways in which policy and political processes affect what counts as authoritative evidence. We expect this exercise to identify important theoretical questions and to help improve the mutual understanding and effectiveness of those working at the interface of science and policy.

A Comparative Analysis of Community Detection Algorithms on Artificial Networks

@article{Yang2016,
author = {Yang, Zhao and Algesheimer, Ren{\'{e}} and Tessone, Claudio J.},
doi = {10.1038/srep30750},
issn = {2045-2322},
journal = {Scientific Reports},
month = {aug},
pages = {30750},
title = {{A Comparative Analysis of Community Detection Algorithms on Artificial Networks}},
url = {http://www.nature.com/articles/srep30750},
volume = {6},
year = {2016}
}

Many community detection algorithms have been developed to uncover the mesoscopic properties of complex networks. However, how good an algorithm is, in terms of accuracy and computing time, remains an open question. Testing algorithms on real-world networks has certain restrictions which make their insights potentially biased: the networks are usually small, and the underlying communities are not defined objectively. In this study, we employ the Lancichinetti-Fortunato-Radicchi benchmark graph to test eight state-of-the-art algorithms. We quantify the accuracy using complementary measures and the algorithms’ computing time. Based on simple network properties and the aforementioned results, we provide guidelines that help to choose the most adequate community detection algorithm for a given network. Moreover, these rules allow uncovering limitations in the use of specific algorithms given macroscopic network properties. Our contribution is threefold: firstly, we provide actual techniques to determine which algorithm is best suited in most circumstances based on observable properties of the network under consideration. Secondly, we use the mixing parameter as an easily measurable indicator for finding the ranges of reliability of the different algorithms. Finally, we study the dependency on network size, focusing on both the algorithms’ predictive power and the effective computing time.
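For illustration, a minimal sketch (not the authors' code) of the benchmark-and-score procedure the abstract describes, assuming networkx's LFR generator and Louvain implementation and scikit-learn's NMI; the parameter values are illustrative only:

```python
# Generate an LFR benchmark graph with a chosen mixing parameter mu, run one
# community detection algorithm, and score the recovered partition against
# the planted one with normalized mutual information (NMI).
import networkx as nx
from networkx.algorithms import community
from sklearn.metrics import normalized_mutual_info_score

n, tau1, tau2, mu = 250, 3.0, 1.5, 0.1           # illustrative benchmark parameters
G = nx.LFR_benchmark_graph(n, tau1, tau2, mu,
                           average_degree=5, min_community=20, seed=10)

# Ground-truth community label per node (stored by the generator).
truth = {v: min(G.nodes[v]["community"]) for v in G}

# One of many possible detection algorithms; the paper compares eight.
detected_sets = community.louvain_communities(G, seed=10)
detected = {v: i for i, nodes in enumerate(detected_sets) for v in nodes}

nodes = sorted(G)
nmi = normalized_mutual_info_score([truth[v] for v in nodes],
                                   [detected[v] for v in nodes])
print(f"mixing parameter mu={mu}, NMI={nmi:.3f}")
```

Sweeping mu and the network size in such a loop is the kind of experiment from which the paper's reliability ranges are derived.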

A Comparison of Acoustic Radiation Force Derived Indices of Cardiac Function in the Langendorff Perfused Rabbit Heart

@article{vejdani2016comparison,
title={A Comparison of Acoustic Radiation Force Derived Indices of Cardiac Function in the Langendorff Perfused Rabbit Heart},
author={Vejdani-Jahromi, Maryam and Nagle, Matt and Jiang, Yang and Trahey, Gregg E and Wolf, Patrick D},
year={2016},
publisher={IEEE}
}

In the past decade, there has been increased interest in characterizing cardiac tissue mechanics using newly developed ultrasound-based elastography techniques. These methods excite the tissue mechanically and track its response. Two frequently used methods, acoustic radiation force impulse and shear wave elasticity imaging (ARFI and SWEI), have been considered qualitative and quantitative techniques providing relative and absolute measures of tissue stiffness, respectively. Depending on imaging conditions, it is desirable to identify indices of cardiac function that can be measured by ARFI and SWEI and to characterize the relationship between the measures. In this study, we compared two indices (the relaxation time constant used for diastolic dysfunction assessment and the systolic/diastolic stiffness ratio) measured nearly simultaneously by M-mode ARFI and SWEI techniques. We additionally correlated ARFI-measured inverse displacements with SWEI-measured values of the shear modulus of stiffness. For the eight animals studied, the average relaxation time constants (τ) measured by ARFI and SWEI were 69±18 ms (R²=0.96) and 65±19 ms (R²=0.99), with an ARFI-SWEI inter-rater agreement of 0.90. The average systolic/diastolic stiffness ratios for the ARFI and SWEI measurements were 6.01±1.37 and 7.12±3.24, respectively (agreement=0.70). The shear modulus of stiffness (SWEI) was linearly related to the inverse displacement values (ARFI), with a 95% CI for the slope of 0.010-0.011 (1/μm)/(kPa) (R²=0.73). In conclusion, the relaxation time constant and the systolic/diastolic stiffness ratio were calculated with good agreement between the ARFI- and SWEI-derived measurements. ARFI relative and SWEI absolute stiffness measurements were linearly related, with slopes varying with imaging conditions and subject tissue properties.
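A minimal sketch of the linear-relation analysis reported in the last result (slope with 95% CI and R²), not the authors' analysis code; the paired measurements below are synthetic placeholders:

```python
# Fit the linear relation between ARFI inverse displacement and SWEI shear
# modulus and report the slope, its 95% confidence interval, and R^2.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
shear_modulus_kpa = rng.uniform(2.0, 30.0, size=60)            # hypothetical SWEI values
displacement_um = 1.0 / (0.0105 * shear_modulus_kpa) + rng.normal(0, 0.5, 60)
inverse_displacement = 1.0 / displacement_um                   # ARFI-derived quantity

res = stats.linregress(shear_modulus_kpa, inverse_displacement)
t_crit = stats.t.ppf(0.975, df=len(shear_modulus_kpa) - 2)     # 95% CI from the slope's SE
ci = (res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr)
print(f"slope={res.slope:.4f} (1/um)/kPa, 95% CI=({ci[0]:.4f}, {ci[1]:.4f}), "
      f"R^2={res.rvalue**2:.2f}")
```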

A complementary view on the growth of directory trees

@article{tessone2009b,
author = {Geipel, Markus M and Tessone, Claudio Juan and Schweitzer, Frank},
doi = {10.1140/epjb/e2009-00302-5},
issn = {1434-6028},
journal = {The European Physical Journal B},
month = {sep},
number = {4},
pages = {641–648},
title = {{A complementary view on the growth of directory trees}},
url = {http://www.springerlink.com/index/10.1140/epjb/e2009-00302-5},
volume = {71},
year = {2009}
}

Trees are a special sub-class of networks with unique properties, such as the level distribution, which has often been overlooked. We analyse a general tree growth model proposed by Klemm et al. (2005) to explain the growth of user-generated directory structures in computers. The model has a single parameter q, which interpolates between preferential attachment and random growth. Our analysis results in three contributions: First, we propose a more efficient estimation method for q based on the degree distribution, which is one specific representation of the model. Next, we introduce the concept of a level distribution and analytically solve the model for this representation. This allows for an alternative and independent measure of q. We argue that, to capture real growth processes, the q estimations from the degree and level distributions should coincide. Thus, we finally apply both representations to validate the model with synthetically generated tree structures, as well as with collected data of user directories. In the case of real directory structures, we show that the values of q measured from the level distribution are incompatible with those measured from the degree distribution. In contrast, we find perfect agreement in the case of simulated data. We therefore conclude that the model is an incomplete description of the growth of real directory structures, as it fails to reproduce the level distribution. This insight can be generalised to point out the importance of the level distribution for modelling tree growth.
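A minimal sketch of this kind of tree growth and its level distribution; the attachment rule below (preferential with probability q, uniform otherwise) is an assumption about the model's details, not the authors' exact specification:

```python
# Grow a tree in which each new node picks its parent preferentially
# (proportional to current degree) with probability q and uniformly at random
# otherwise, then tabulate the level (depth) distribution.
import random
from collections import Counter

def grow_tree(n_nodes, q, seed=0):
    rng = random.Random(seed)
    parents = {0: None}       # node -> parent
    degree = {0: 1}           # root gets weight 1 so the first draws are well defined
    levels = {0: 0}
    for new in range(1, n_nodes):
        existing = list(parents)
        if rng.random() < q:  # preferential attachment step
            parent = rng.choices(existing, weights=[degree[v] for v in existing], k=1)[0]
        else:                 # uniform random attachment step
            parent = rng.choice(existing)
        parents[new] = parent
        levels[new] = levels[parent] + 1
        degree[parent] += 1
        degree[new] = 1
    return levels, degree

levels, degree = grow_tree(10_000, q=0.7)
level_distribution = Counter(levels.values())
print(sorted(level_distribution.items())[:10])
```

Comparing the q estimated from the simulated degree distribution with the one estimated from the level distribution is the consistency check the paper applies to real directory data.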

A Fistful of Bitcoins: Characterizing Payments among Men with No Names

@inproceedings{meiklejohn2013fistful,
title={A fistful of bitcoins: characterizing payments among men with no names},
author={Meiklejohn, Sarah and Pomarole, Marjori and Jordan, Grant and Levchenko, Kirill and McCoy, Damon and Voelker, Geoffrey M and Savage, Stefan},
booktitle={Proceedings of the 2013 conference on Internet measurement conference},
pages={127–140},
year={2013},
organization={ACM}
}

Bitcoin is a purely online virtual currency, unbacked by either physical commodities or sovereign obligation; instead, it relies on a combination of cryptographic protection and a peer-to-peer protocol for witnessing settlements. Consequently, Bitcoin has the unintuitive property that while the ownership of money is implicitly anonymous, its flow is globally visible. In this paper we explore this unique characteristic further, using heuristic clustering to group Bitcoin wallets based on evidence of shared authority, and then using re-identification attacks (i.e., empirical purchasing of goods and services) to classify the operators of those clusters. From this analysis, we characterize longitudinal changes in the Bitcoin market, the stresses these changes are placing on the system, and the challenges for those seeking to use Bitcoin for criminal or fraudulent purposes at scale.
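A minimal sketch of the shared-authority clustering idea (addresses that co-appear as inputs of one transaction are grouped together via union-find); the transaction data here are hypothetical placeholders, not real blockchain parsing, and the paper's full methodology includes further heuristics:

```python
# Union-find clustering of input addresses that appear in the same transaction.
class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

transactions = [                      # each entry: list of input addresses (hypothetical)
    ["addr_A", "addr_B"],
    ["addr_B", "addr_C"],
    ["addr_D"],
]

uf = UnionFind()
for inputs in transactions:
    for addr in inputs[1:]:
        uf.union(inputs[0], addr)     # all inputs of a transaction share authority

clusters = {}
for addr in {a for tx in transactions for a in tx}:
    clusters.setdefault(uf.find(addr), set()).add(addr)
print(clusters)                       # addr_A, addr_B, addr_C end up in one cluster
```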

A Model of Dynamic Rewiring and Knowledge Exchange in R{\&}D Networks

@article{Tomasello2016,
author = {Tomasello, Mario Vincenzo and Tessone, Claudio Juan and Schweitzer, Frank},
doi = {10.1142/S0219525916500041},
issn = {0219-5259},
journal = {Advances in Complex Systems},
month = {feb},
number = {01n02},
pages = {1650004},
title = {{A Model of Dynamic Rewiring and Knowledge Exchange in R{&}D Networks}},
url = {http://www.worldscientific.com/doi/10.1142/S0219525916500041},
volume = {19},
year = {2016}
}

This paper investigates the process of knowledge exchange in inter-firm Research and Development (R&D) alliances by means of an agent-based model. Extant research has pointed out that firms select alliance partners considering both network-related and network-unrelated features (e.g., social capital versus complementary knowledge stocks). In our agent-based model, firms are located in a metric knowledge space. The interaction rules incorporate an exploration phase and a knowledge transfer phase, during which firms search for a new partner and then evaluate whether they can establish an alliance to exchange their knowledge stocks. The model parameters determining the overall system properties are the rate at which alliances form and dissolve and the agents’ interaction radius. Next, we define a novel indicator of performance, based on the distance traveled by the firms in the knowledge space. Remarkably, we find that, depending on the alliance formation rate and the interaction radius, firms tend to cluster around one or more attractors in the knowledge space, whose position is an emergent property of the system. More importantly, we find an inverted U-shaped dependence of the network performance on both model parameters.
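A highly simplified sketch of this kind of agent-based dynamics, under stated assumptions rather than the paper's exact update rules: firms live in a 2-D knowledge space, a firm may ally with a partner found within its interaction radius, and allied firms move toward each other as a proxy for knowledge exchange, with distance traveled as the performance measure:

```python
import numpy as np

def simulate(n_firms=50, steps=1000, alliance_rate=0.3, radius=0.2,
             step_size=0.05, seed=1):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 1.0, size=(n_firms, 2))     # knowledge positions
    traveled = np.zeros(n_firms)
    for _ in range(steps):
        i = rng.integers(n_firms)
        if rng.random() > alliance_rate:
            continue                                   # no alliance formed this step
        dists = np.linalg.norm(pos - pos[i], axis=1)
        dists[i] = np.inf
        candidates = np.flatnonzero(dists < radius)    # exploration phase
        if candidates.size == 0:
            continue
        j = rng.choice(candidates)
        move = step_size * (pos[j] - pos[i])           # knowledge transfer phase
        pos[i] += move
        pos[j] -= move
        traveled[i] += np.linalg.norm(move)
        traveled[j] += np.linalg.norm(move)
    return pos, traveled.mean()

positions, performance = simulate()
print(f"mean distance traveled (performance proxy): {performance:.3f}")
```

Sweeping alliance_rate and radius in such a simulation is the kind of experiment in which an inverted U-shaped performance curve can be observed.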


A multiple hold-out framework for Sparse Partial Least Squares

@article{monteiro2016multiple,
title={A multiple hold-out framework for Sparse Partial Least Squares},
author={Monteiro, Jo{\~a}o M and Rao, Anil and Shawe-Taylor, John and Mour{\~a}o-Miranda, Janaina and Alzheimer's Disease Initiative and others},
journal={Journal of Neuroscience Methods},
volume={271},
pages={182–194},
year={2016},
publisher={Elsevier}
}

Supervised classification machine learning algorithms may have limitations when studying brain diseases with heterogeneous populations, as the labels might be unreliable. More exploratory approaches, such as Sparse Partial Least Squares (SPLS), may provide insights into the brain’s mechanisms by finding relationships between neuroimaging and clinical/demographic data. The identification of these relationships has the potential to improve the current understanding of disease mechanisms, refine clinical assessment tools, and stratify patients. SPLS finds multivariate associative effects in the data by computing pairs of sparse weight vectors, where each pair is used to remove its corresponding associative effect from the data by matrix deflation, before computing additional pairs.
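A minimal sketch of the SPLS core step the abstract refers to (alternating soft-thresholded power iterations on X^T Y followed by matrix deflation), not the authors' multiple hold-out framework itself; the sparsity thresholds and data are illustrative:

```python
import numpy as np

def soft_threshold(w, lam):
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def spls_pairs(X, Y, n_pairs=2, lam_u=0.1, lam_v=0.1, n_iter=100):
    X, Y = X.copy(), Y.copy()
    pairs = []
    for _ in range(n_pairs):
        C = X.T @ Y
        u = C[:, 0] / (np.linalg.norm(C[:, 0]) + 1e-12)   # initialisation
        for _ in range(n_iter):
            v = soft_threshold(C.T @ u, lam_v)
            v /= np.linalg.norm(v) + 1e-12
            u = soft_threshold(C @ v, lam_u)
            u /= np.linalg.norm(u) + 1e-12
        pairs.append((u, v))
        # Deflation: remove this pair's associative effect before the next pair.
        t, s = X @ u, Y @ v
        X -= np.outer(t, t) @ X / (t @ t)
        Y -= np.outer(s, s) @ Y / (s @ s)
    return pairs

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))     # e.g. neuroimaging features (synthetic)
Y = rng.normal(size=(100, 5))      # e.g. clinical/demographic scores (synthetic)
pairs = spls_pairs(X, Y)
print([(np.count_nonzero(u), np.count_nonzero(v)) for u, v in pairs])
```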

A new proof for the conditions of Novikov and Kazamaki

@article{ruf2013new,
title={A new proof for the conditions of Novikov and Kazamaki},
author={Ruf, Johannes},
journal={Stochastic Processes and Their Applications},
volume={123},
number={2},
pages={404–421},
year={2013},
publisher={Elsevier}
}

This paper provides a novel proof for the sufficiency of certain well-known criteria that guarantee the martingale property of a continuous, nonnegative local martingale. More precisely, it is shown that generalizations of Novikov’s condition and Kazamaki’s criterion follow directly from the existence of Föllmer’s measure. This approach makes it possible to extend well-known martingality criteria from strictly positive to merely nonnegative, continuous local martingales.
Keywords: local martingale; stochastic exponential; Föllmer’s measure; uniform integrability; lower function; Bessel process
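For orientation, the classical Novikov condition (a standard statement, not reproduced from the paper) reads:

```latex
% If M is a continuous local martingale with M_0 = 0, Novikov's condition
% guarantees that its stochastic exponential is a true martingale on [0, T].
\[
  \mathcal{E}(M)_t = \exp\!\Big( M_t - \tfrac{1}{2}\langle M\rangle_t \Big),
  \qquad
  \mathbb{E}\!\left[ \exp\!\Big( \tfrac{1}{2}\langle M\rangle_T \Big) \right] < \infty
  \;\Longrightarrow\;
  \big(\mathcal{E}(M)_t\big)_{t\le T} \text{ is a martingale.}
\]
```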

A nonlinear impact: evidences of causal effects of social media on market prices

@article{aste2,
title={A nonlinear impact: evidences of causal effects of social media on market prices},
author={Aste, Tomaso and Souza, Tharsis},
journal={},
year={2016}
}

We provide empirical evidence suggesting that social media and stock markets have a nonlinear causal relationship. We take advantage of an extensive data set composed of social media messages related to DJIA index components. By using information-theoretic measures to account for possible nonlinear causal coupling between the social media and stock market systems, we point out striking differences in the results with respect to linear coupling. Two main conclusions are drawn: first, the significant causality of social media on stocks’ returns is purely nonlinear in most cases; second, social media dominates the directional coupling with the stock market, an effect not observable within linear modeling. The results also serve as empirical guidance on model adequacy in the investigation of sociotechnical and financial systems.
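A minimal sketch of one information-theoretic causality measure of the kind referred to above, transfer entropy between two discretised series with one-step histories; this is an illustration on synthetic data, not the paper's exact estimator:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{x->y} (bits) for integer-coded series, history length 1."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))      # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))            # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))             # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]           # p(y_{t+1} | y_t, x_t)
        p_cond_own = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_own)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=5000)                      # e.g. binarised sentiment (synthetic)
y = np.where(rng.random(5000) < 0.8, np.roll(x, 1), rng.integers(0, 2, size=5000))
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")
```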

A novel application of particle swarm optimisation to optimal trade execution

@inproceedings{saeidi2013novel,
title={A Novel Application of Particle Swarm Optimisation to Optimal Trade Execution},
author={Saeidi, Marzieh and Gorse, Denise},
booktitle={International Conference on Neural Information Processing},
pages={448–455},
year={2013},
organization={Springer}
}

Particle swarm optimisation (PSO) is applied for the first time to the problem of optimal trade execution, which aims to partition a large trade so as to minimise hidden costs, here specifically a combination of market impact and opportunity risk. A large order is divided into a set of smaller ones, with both the length of time these remain open and the proportion of the original order they represent being subject to optimisation. It is found that the proposed method can equal the performance of the very popular volume-based VWAP method without, in our case, having access to trading volume information.
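A minimal sketch (not the authors' implementation) of PSO applied to splitting a parent order into child-order proportions; the cost function is a hypothetical stand-in combining a market-impact term and an opportunity-risk term, while the velocity/position update is the standard PSO rule:

```python
import numpy as np

rng = np.random.default_rng(0)
n_slices, n_particles, n_iters = 10, 30, 200
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights

def cost(raw):
    x = np.exp(raw) / np.exp(raw).sum()        # map particle to proportions summing to 1
    impact = np.sum(x ** 1.5)                  # convex market-impact proxy (assumed form)
    risk = np.sum(x * np.arange(n_slices))     # later execution -> more opportunity risk
    return impact + 0.05 * risk

pos = rng.normal(size=(n_particles, n_slices))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

schedule = np.exp(gbest) / np.exp(gbest).sum()
print("optimised slice proportions:", np.round(schedule, 3))
```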

A novel method of determining events in combination gas boilers: Assessing the feasibility of a passive acoustic sensor

@article{neeld2016novel,
title={A novel method of determining events in combination gas boilers: Assessing the feasibility of a passive acoustic sensor},
author={Neeld, Thomas and Eaton, James and Naylor, Patrick A and Shipworth, David},
journal={Building and Environment},
volume={100},
pages={1–9},
year={2016},
publisher={Elsevier}
}

To assess the impact of interventions designed to reduce residential space heating demand, investigators must be armed with field-trial-applicable techniques that accurately measure space heating energy use. This study assesses the feasibility of using a passive acoustic sensor to detect gas consumption events in domestic combination gas-fired boilers (C-GFBs). The investigation has shown that, for the C-GFB investigated, the following events are discernible using a passive acoustic sensor: demand type (hot water or central heating); boiler ignition time; and pre-mix fan motor speed. A detection algorithm was developed to automatically identify demand type and burner ignition time with accuracies of 100% and 97%, respectively. Demand type was determined by training a naive Bayes classifier on 20 features of the acoustic profile at the start of a demand event. Burner ignition was determined by detecting low frequency (5–10 Hz) pressure pulsations produced during ignition. The acoustic signatures of the pre-mix fan and circulation pump were identified manually. Additional work is required to detect burner duration, deal with detection in the presence of increased noise, and expand the range of boilers investigated. The widespread use of such techniques would have considerable implications for improving understanding of space heating demand.
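A minimal sketch of the classification step described above, a naive Bayes classifier trained on features extracted from the start of a demand event; the feature extraction and data below are placeholders, not the study's pipeline or its 20 acoustic features:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def extract_features(audio_frame):
    """Placeholder feature vector: 20 coarse spectral band energies of the event onset."""
    spectrum = np.abs(np.fft.rfft(audio_frame)) ** 2
    bands = np.array_split(spectrum, 20)
    return np.array([band.mean() for band in bands])

# Synthetic stand-in data: rows = demand events, columns = acoustic features.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 50)                            # 0 = hot water, 1 = central heating
X = np.vstack([extract_features(rng.normal(size=4096) * (1 + label))
               for label in labels])

clf = GaussianNB()
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```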

A one-dimensional diffusion hits points fast

@article{bruggeman2016one,
title={A one-dimensional diffusion hits points fast},
author={Bruggeman, Cameron and Ruf, Johannes and others},
journal={Electronic Communications in Probability},
volume={21},
year={2016},
publisher={The Institute of Mathematical Statistics and the Bernoulli Society}
}

A one-dimensional, continuous, regular, and strong Markov process X with state space E hits any point z ∈ E fast with positive probability. To wit, if τ_z = inf{t ≥ 0 : X_t = z}, then P_ξ(τ_z < ε) > 0 for all ξ ∈ E and ε > 0.
Keywords: diffusion; hitting time; support.

A Profitable Sub-Prime Loan: Obtaining the Advantages of Composite Order in Prime-Order Bilinear Groups

@article{lewko2013profitable,
title={A Profitable Sub-Prime Loan: Obtaining the Advantages of Composite-Order in Prime-Order Bilinear Groups.},
author={Lewko, Allison B and Meiklejohn, Sarah},
journal={IACR Cryptology ePrint Archive},
volume={2013},
pages={300},
year={2013},
publisher={Citeseer}
}

Composite-order bilinear groups provide many structural features that are useful for both constructing cryptographic primitives and enabling security reductions. Despite these convenient features, however, composite-order bilinear groups are less desirable than prime-order bilinear groups for reasons of both efficiency and security. A recent line of work has therefore focused on translating these structural features from the composite-order to the prime-order setting; much of this work focused on two such features, projecting and canceling, in isolation, but a result due to Seo and Cheon showed that both features can be obtained simultaneously in the prime-order setting.
In this paper, we reinterpret the construction of Seo and Cheon in the context of dual pairing vector spaces (which provide canceling as well as useful parameter hiding features) to obtain a unified framework that simulates all of these composite-order features in the prime-order setting. We demonstrate the strength of this framework by providing two applications: one that adds dual pairing vector spaces to the existing projection in the Boneh-Goh-Nissim encryption scheme to obtain leakage resilience, and another that adds the concept of projecting to the existing dual pairing vector spaces in an IND-CPA-secure IBE scheme to “boost” its security to IND-CCA1. Our leakage-resilient BGN application is of independent interest, and it is not clear how to achieve it from pure composite-order techniques without mixing in additional vector space tools. Both applications rely solely on the Symmetric External Diffie Hellman assumption (SXDH).
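For orientation, the SXDH assumption in an asymmetric pairing setting can be stated as follows (a standard formulation, not taken from the paper):

```latex
% Symmetric External Diffie-Hellman (SXDH): DDH is assumed hard in both
% source groups G_1 and G_2 of an asymmetric bilinear group setting.
\[
  \text{For } (p, G_1, G_2, G_T, e, g_1, g_2) \text{ and } i \in \{1, 2\}:
  \quad
  \big(g_i^{a},\, g_i^{b},\, g_i^{ab}\big)
  \;\approx_c\;
  \big(g_i^{a},\, g_i^{b},\, g_i^{c}\big),
  \qquad a, b, c \xleftarrow{\$} \mathbb{Z}_p .
\]
```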