Publications
This is a complete list of publications I have authored or co-authored. My h-index and i10-index are available on Google Scholar. Clicking on the text of an entry shows its BibTeX citation, together with a download link, if one is available.
2024
Dilermando Queiroz Neto, Anderson Carlos, Maíra Fatoretto, Luis Filipe Nakayama, André Anjos, and Lilian Berton. Does data-efficient generalization exacerbate bias in foundation models? In Proceedings of the 18th European Conference on Computer Vision (ECCV). October 2024.
Article
@inproceedings{eccv-2024,
author = "Queiroz Neto, Dilermando and Carlos, Anderson and Fatoretto, Ma{\'{\i}}ra and Nakayama, Luis Filipe and Anjos, Andr{\'{e}} and Berton, Lilian",
projects = "FAIRMI",
month = "October",
title = "Does Data-Efficient Generalization Exacerbate Bias in Foundation Models?",
booktitle = "Proceedings of the 18th European Conference on Computer Vision (ECCV)",
year = "2024",
abstract = "Foundation models have emerged as robust models with label efficiency in diverse domains. In medical imaging, these models contribute to the advancement of medical diagnoses due to the difficulty in obtaining labeled data. However, it is unclear whether using a large amount of unlabeled data, biased by the presence of sensitive attributes during pre-training, influences the fairness of the model. This research examines the bias in the Foundation model (RetFound) when it is applied to fine-tune the Brazilian Multilabel Ophthalmological Dataset (BRSET), which has a different population than the pre-training dataset. The model evaluation, in comparison with supervised learning, shows that the Foundation Model has the potential to reduce the gap between the maximum AUC and minimum AUC evaluations across gender and age groups. However, in a data-efficient generalization, the model increases the bias when the data amount decreases. These findings suggest that when deploying a Foundation Model in real-life scenarios with limited data, the possibility of fairness issues should be considered.",
pdf = "https://publications.idiap.ch/attachments/papers/2024/QueirozNeto_ECCV_2024.pdf"
}
Dilermando Queiroz Neto, André Anjos, and Lilian Berton. Using backbone foundation model for evaluating fairness in chest radiography without demographic data. In Proceedings of the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI). October 2024.
Article
@inproceedings{miccai-2024,
author = "Queiroz Neto, Dilermando and Anjos, Andr{\'{e}} and Berton, Lilian",
keywords = "Fairness, Foundation Model, Medical Image",
month = "October",
title = "Using Backbone Foundation Model for Evaluating Fairness in Chest Radiography Without Demographic Data",
booktitle = "Proceedings of the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI)",
year = "2024",
abstract = "Ensuring consistent performance across diverse populations and incorporating fairness into machine learning models are crucial for advancing medical image diagnostics and promoting equitable healthcare. However, many databases do not provide protected attributes or contain unbalanced representations of demographic groups, complicating the evaluation of model performance across different demographics and the application of bias mitigation techniques that rely on these attributes. This study aims to investigate the effectiveness of using the backbone of Foundation Models as an embedding extractor for creating groups that represent protected attributes, such as gender and age. We propose utilizing these groups in different stages of bias mitigation, including pre-processing, in-processing, and evaluation. Using databases in and out-of-distribution scenarios, it is possible to identify that the method can create groups that represent gender in both databases and reduce in 4.44\% the difference between the gender attribute in-distribution and 6.16\% in out-of-distribution. However, the model lacks robustness in handling age attributes, underscoring the need for more fundamentally fair and robust Foundation models. These findings suggest a role in promoting fairness assessment in scenarios where we lack knowledge of attributes, contributing to the development of more equitable medical diagnostics.",
pdf = "https://publications.idiap.ch/attachments/papers/2024/QueirozNeto_CVPR_2024.pdf"
}
Özgür Güler, Manuel Günther, and André Anjos. Refining tuberculosis detection in CXR imaging: addressing bias in deep neural networks via interpretability. In Proceedings of the 12th European Workshop on Visual Information Processing. September 2024.
Article
@inproceedings{euvip-2024-1,
author = {G{\"{u}}ler, {\"{O}}zg{\"{u}}r and G{\"{u}}nther, Manuel and Anjos, Andr{\'{e}}},
month = "September",
title = "Refining Tuberculosis Detection in CXR Imaging: Addressing Bias in Deep Neural Networks via Interpretability",
booktitle = "Proceedings of the 12th European Workshop on Visual Information Processing",
year = "2024",
abstract = "Automatic classification of active tuberculosis from chest X-ray images has the potential to save lives, especially in low- and mid-income countries where skilled human experts can be scarce. Given the lack of available labeled data to train such systems and the unbalanced nature of publicly available datasets, we argue that the reliability of deep learning models is limited, even if they can be shown to obtain perfect classification accuracy on the test data. One way of evaluating the reliability of such systems is to ensure that models use the same regions of input images for predictions as medical experts would. In this paper, we show that pre-training a deep neural network on a large-scale proxy task, as well as using mixed objective optimization network (MOON), a technique to balance different classes during pre-training and fine-tuning, can improve the alignment of decision foundations between models and experts, as compared to a model directly trained on the target dataset. At the same time, these approaches keep perfect classification accuracy according to the area under the receiver operating characteristic curve (AUROC) on the test set, and improve generalization on an independent, unseen dataset. For the purpose of reproducibility, our source code is made available online.",
pdf = "https://publications.idiap.ch/attachments/papers/2024/Guler_EUVIP24_2024.pdf"
}
Oscar Jimenez-del-Toro, Christoph Aberle, Roger Schaer, Michael Bach, Kyriakos Flouris, Ender Konukoglu, Bram Stieltjes, Markus M. Obmann, André Anjos, Henning Müller, and Adrien Depeursinge. Comparing stability and discriminatory power of hand-crafted versus deep radiomics: a 3D-printed anthropomorphic phantom study. In Proceedings of the 12th European Workshop on Visual Information Processing. September 2024.
Article
@inproceedings{euvip-2024-2,
author = {Jimenez-del-Toro, Oscar and Aberle, Christoph and Schaer, Roger and Bach, Michael and Flouris, Kyriakos and Konukoglu, Ender and Stieltjes, Bram and Obmann, Markus M. and Anjos, Andr{\'{e}} and M{\"{u}}ller, Henning and Depeursinge, Adrien},
month = "September",
title = "Comparing Stability and Discriminatory Power of Hand-crafted Versus Deep Radiomics: A 3D-Printed Anthropomorphic Phantom Study",
booktitle = "Proceedings of the 12th European Workshop on Visual Information Processing",
year = "2024",
abstract = "Radiomics have the ability to comprehensively quantify human tissue characteristics in medical imaging studies. However, standard radiomic features are highly unstable due to their sensitivity to scanner and reconstruction settings. We present an evaluation framework for the extraction of 3D deep radiomics features using a pre-trained neural network on real computed tomography (CT) scans for tissue characterization. We compare both the stability and discriminative power of the proposed 3D deep learning radiomic features versus standard hand-crafted radiomic features using 8 image acquisition protocols with a 3D-printed anthropomorphic phantom containing 4 classes of liver lesions and normal tissue. Even when the deep learning model was trained on an external dataset and for a different tissue characterization task, the resulting generic deep radiomics are at least twice more stable on 8 CT parameter variations than any category of hand-crafted features. Moreover, the 3D deep radiomics were also discriminative for the tissue characterization between 4 classes of liver tissue and lesions, with an average discriminative power of 93.5\%.",
pdf = "https://publications.idiap.ch/attachments/papers/2024/Jimenez-del-Toro_EUVIP2024_2024.pdf"
}
Victor Amiot, Oscar Jimenez-del-Toro, Yan Guex-Croisier, Muriel Ott, Teodora-Elena Bogaciu, Shalini Banerjee, Jeremy Howell, Christoph Amstutz, Christophe Chiquet, Ciara Bergin, Ilenia Meloni, Mattia Tomasoni, Florence Hoogewoud, and André Anjos. Automatic transformer-based grading of multiple retinal inflammatory signs on fluorescein angiography. September 2024. URL: https://papers.ssrn.com/abstract=4960069, doi:10.2139/ssrn.4960069.
@misc{ssrn-2024,
author = "Amiot, Victor and Jimenez-del-Toro, Oscar and Guex-Croisier, Yan and Ott, Muriel and Bogaciu, Teodora-Elena and Banerjee, Shalini and Howell, Jeremy and Amstutz, Christoph and Chiquet, Christophe and Bergin, Ciara and Meloni, Ilenia and Tomasoni, Mattia and Hoogewoud, Florence and Anjos, Andr{\'{e}}",
title = "Automatic Transformer-Based Grading of Multiple Retinal Inflammatory Signs on Fluorescein Angiography",
url = "https://papers.ssrn.com/abstract=4960069",
doi = "10.2139/ssrn.4960069",
abstract = "Background: Grading fluorescein angiography ({FA}) in the context of uveitis is complex, often leading to the oversight of retinal inflammation in clinical studies. This study aims to develop an automated method for grading retinal inflammation.",
number = "4960069",
year = "2024",
month = "September",
day = "24",
keywords = "capillaropathy, Deep Learning, disease grading, fluorescein angiography, inter-grader agreement, macular edema, optic disc hyperfluorescence, ordinal classification index, papillitis, retinal inflammation, transformers, Uveitis, vascular leakage, vasculitis"
}
Thibaud Mautuit, Pierre Cunnac, Frédéric Truffer, André Anjos, Rebecca Dufrane, Gilbert Maître, Martial Geiser, and Christophe Chiquet. Absolute retinal blood flow in healthy eyes and in eyes with retinal vein occlusion. Microvascular Research, January 2024. doi:10.1016/j.mvr.2023.104648.
@article{mvr-2024,
author = "Mautuit, Thibaud and Cunnac, Pierre and Truffer, Fr{\'{e}}d{\'{e}}ric and Anjos, Andr{\'{e}} and Dufrane, Rebecca and Ma{\^{\i}}tre, Gilbert and Geiser, Martial and Chiquet, Christophe",
month = "January",
title = "Absolute retinal blood flow in healthy eyes and in eyes with retinal vein occlusion",
journal = "Microvascular Research",
volume = "152",
year = "2024",
issn = "0026-2862",
doi = "10.1016/j.mvr.2023.104648",
abstract = "Purpose: To measure non-invasively retinal venous blood flow (RBF) in healthy subjects and patients with retinal venous occlusion (RVO). Methods: The prototype named AO-LDV (Adaptive Optics Laser Doppler Velocimeter), which combines a new absolute laser Doppler velocimeter with an adaptive optics fundus camera (rtx1, Imagine Eyes{\textregistered}, Orsay, France), was studied for the measurement of absolute RBF as a function of retinal vessel diameters and simultaneous measurement of red blood cell velocity. RBF was measured in healthy subjects (n = 15) and patients with retinal venous occlusion (RVO, n = 6). We also evaluated two software packages for the measurement of retinal vessel diameters: software 1 (automatic vessel detection, profile analysis) and software 2 (based on the use of deep neural networks for semantic segmentation of vessels, using a M2u-Net architecture). Results: Software 2 provided a higher rate of automatic retinal vessel measurement (99.5\% of 12,320 AO images) than software 1 (64.9\%) and wider measurements (75.5 ± 15.7 μm vs 70.9 ± 19.8 μm, p < 0.001). For healthy subjects (n = 15), all the retinal veins in one eye were measured to obtain the total RBF. In healthy subjects, the total RBF was 37.8 ± 6.8 μl/min. There was a significant linear correlation between retinal vessel diameter and maximal velocity (slope = 0.1016; p < 0.001; $r^2 = 0.8597$) and a significant power curve correlation between retinal vessel diameter and blood flow ($3.63 \times 10^{-5} \times D^{2.54}$; p < 0.001; $r^2 = 0.7287$). No significant relationship was found between total RBF and systolic and diastolic blood pressure, ocular perfusion pressure, heart rate, or hematocrit. For RVO patients (n = 6), a significant decrease in RBF was noted in occluded veins (3.51 ± 2.25 μl/min) compared with the contralateral healthy eye (11.07 ± 4.53 μl/min). For occluded vessels, the slope between diameter and velocity was 0.0195 (p < 0.001; $r^2 = 0.6068$) and the relation between diameter and flow was $Q = 9.91 \times 10^{-6} \times D^{2.41}$ (p < 0.01; $r^2 = 0.2526$). Conclusion: This AO-LDV prototype offers a new opportunity to study RBF in humans and to evaluate treatment in retinal vein diseases."
}
2023
Victor Amiot, Oscar Jimenez-del-Toro, Pauline Eyraud, Yan Guex-Crosier, Ciara Bergin, André Anjos, Florence Hoogewoud, and Mattia Tomasoni. Fully automatic grading of retinal vasculitis on fluorescein angiography time-lapse from real-world data in clinical settings. In 2023 IEEE 36th International Symposium on Computer-Based Medical Systems (CBMS). June 2023. doi:10.1109/CBMS58004.2023.00301.
@inproceedings{cbms-2023,
author = "Amiot, Victor and Jimenez-del-Toro, Oscar and Eyraud, Pauline and Guex-Crosier, Yan and Bergin, Ciara and Anjos, Andr{\'{e}} and Hoogewoud, Florence and Tomasoni, Mattia",
title = "Fully Automatic Grading of Retinal Vasculitis on Fluorescein Angiography Time-lapse from Real-world Data in Clinical Settings",
booktitle = "2023 IEEE 36th International Symposium on Computer-Based Medical Systems (CBMS)",
year = "2023",
month = "June",
doi = "10.1109/CBMS58004.2023.00301",
abstract = "The objective of this study is to showcase a pipeline able to perform fully automated grading of retinal inflammation based on a standardised, clinically-validated grading scale. The application of such a scale has so far been hindered by the amount of time required to (manually) apply it in clinical settings. Our dataset includes 3,205 fluorescein angiography images from 148 patients and 242 eyes from the uveitis department of Jules Gonin Eye Hospital. The data was automatically extracted from a medical device, in hospital settings. Images were graded by a medical expert. We focused specifically on one type of inflammation, namely retinal vasculitis. Our pipeline comprises both learning-based models (Pasa model with F1 score = 0.81, AUC = 0.86), and an intensity-based approach to serve as a baseline (F1 score = 0.57, AUC = 0.66). A recall of up to 0.833 computed in an independent test set is comparable to the scores obtained by available state-of-the-art approaches. Here we present the first fully automated pipeline for the grading of retinal vasculitis from raw medical images that is applicable to real-world clinical data."
}
C. Geric, Z. Z. Qin, C. M. Denkinger, S. V. Kik, B. Marais, André Anjos, P.-M. David, F. A. Khan, and A. Trajman. The rise of artificial intelligence reading of chest X-rays for enhanced TB diagnosis and elimination. INT J TUBERC LUNG DIS, May 2023. doi:10.5588/ijtld.22.0687.
@article{ijtld-2023,
author = "Geric, C. and Qin, Z. Z. and Denkinger, C. M. and Kik, S. V. and Marais, B. and Anjos, Andr{\'{e}} and David, P.-M. and Khan, F. A. and Trajman, A.",
title = "The rise of artificial intelligence reading of chest X-rays for enhanced TB diagnosis and elimination",
doi = "10.5588/ijtld.22.0687",
abstract = "We provide an overview of the latest evidence on computer-aided detection (CAD) software for automated interpretation of chest radiographs (CXRs) for TB detection. CAD is a useful tool that can assist in rapid and consistent CXR interpretation for TB. CAD can achieve high-sensitivity TB detection among people seeking care with symptoms of TB and in population-based screening, and has accuracy on par with human readers. However, implementation challenges remain. Due to diagnostic heterogeneity between settings and sub-populations, users need to select threshold scores rather than use pre-specified ones, but some sites may lack the resources and data to do so. Efficient standardisation is further complicated by frequent updates and new CAD versions, which also challenges implementation and comparison. CAD has not been validated for TB diagnosis in children and its accuracy for identifying non-TB abnormalities remains to be evaluated. A number of economic and political issues also remain to be addressed through regulation for CAD to avoid furthering health inequities. Although CAD-based CXR analysis has proven remarkably accurate for TB detection in adults, the above issues need to be addressed to ensure that the technology meets the needs of high-burden settings and vulnerable sub-populations.",
journal = "INT J TUBERC LUNG DIS",
volume = "27",
journaltitle = "The International Journal of Tuberculosis and Lung Disease",
year = "2023",
month = "May",
keywords = "computer-aided detection; chest radiology; pulmonary disease; tuberculosis; AI technology"
}
2022
Geoffrey Raposo, Anete Trajman, and André Anjos. Pulmonary tuberculosis screening from radiological signs on chest X-ray images using deep models. In Union World Conference on Lung Health. The Union, November 2022.
@inproceedings{union-2022,
author = "Raposo, Geoffrey and Trajman, Anete and Anjos, Andr{\'{e}}",
month = "November",
title = "Pulmonary Tuberculosis Screening from Radiological Signs on Chest X-Ray Images Using Deep Models",
booktitle = "Union World Conference on Lung Health",
year = "2022",
addendum = "(Issued from master thesis supervision)",
date = "2022-11-01",
organization = "The Union",
abstract = "Background: The World Health Organization has recently recommended the use of computer-aided detection (CAD) systems for screening pulmonary tuberculosis (PT) in Chest X-Ray images. Previous CAD models are based on direct image to probability detection techniques - and do not generalize well (from training to validation databases). We propose a method that overcomes these limitations by using radiological signs as intermediary proxies for PT detection. Design/Methods: We developed a multi-class deep learning model, mapping images to 14 radiological signs such as cavities, infiltration, nodules, and fibrosis, using the National Institute of Health (NIH) CXR14 dataset, which contains 112,120 images. Using three public PTB datasets (Montgomery County - MC, Shenzhen - CH, and Indian - IN), summing up 955 images, we developed a second model mapping F probabilities to PTB diagnosis (binary labels). We evaluated this approach for its generalization capabilities against direct models, learnt directly from PTB training data or by transfer learning via cross-folding and cross-database experiments. The area under the specificity vs. sensitivity curve (AUC) considering all folds was used to summarize the performance of each approach. Results: The AUC for intra-dataset tests baseline direct detection deep models achieved 0.95 (MC), 0.95 (CH) and 0.91 (IN), with up to 35\% performance drop on a cross-dataset evaluation scenario. Our proposed approach achieved AUC of 0.97 (MC), 0.90 (CH), and 0.93 (IN), with at most 11\% performance drop on a cross-dataset evaluation (Table/figures). In most tests, the difference was less than 5\%. Conclusions: A two-step CAD model based on radiological signs offers an adequate base for the development of PT screening systems and is more generalizable than a direct model. Unlike commercially available CADs, our model is completely reproducible and available open source at https://pypi.org/project/bob.med.tb/."
}
Meysam Shamsi, Anthony Larcher, Loic Barrault, Sylvain Meignier, Yevheni Prokopalo, Marie Tahon, Ambuj Mehrish, Simon Petitrenaud, Olivier Galibert, Samuel Gaist, André Anjos, Sébastien Marcel, and Marta R. Costa-jussà. Towards lifelong human assisted speaker diarization. Computer Speech & Language, July 2022. doi:10.1016/j.csl.2022.101437.
Article
@article{elsevier-csal-2022,
author = "Shamsi, Meysam and Larcher, Anthony and Barrault, Loic and Meignier, Sylvain and Prokopalo, Yevheni and Tahon, Marie and Mehrish, Ambuj and Petitrenaud, Simon and Galibert, Olivier and Gaist, Samuel and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien and Costa-juss{\`{a}}, Marta R.",
title = "Towards lifelong human assisted speaker diarization",
issn = "0885-2308",
doi = "10.1016/j.csl.2022.101437",
abstract = "This paper introduces the resources necessary to develop and evaluate human assisted lifelong learning speaker diarization systems. It describes the {ALLIES} corpus and associated protocols, especially designed for diarization of a collection of audio recordings across time. This dataset is compared to existing corpora and the performances of three baseline systems, based on x-vectors, i-vectors and {VBxHMM}, are reported for reference. Those systems are then extended to include an active correction process that efficiently guides a human annotator to improve the automatically generated hypotheses. An open-source simulated human expert is provided to ensure reproducibility of the human assisted correction process and its fair evaluation. An exhaustive evaluation of the human assisted correction shows the high potential of this approach. The {ALLIES} corpus, a baseline system including the active correction module and all evaluation tools are made freely available to the scientific community.",
journal = "Computer Speech \& Language",
journaltitle = "Computer Speech \& Language",
year = "2022",
month = "July",
date = "2022-07-27",
pdf = "https://www.idiap.ch/~aanjos/papers/elsevier-csal-2022.pdf",
keywords = "Evaluation, Human assisted learning, Lifelong learning, Speaker diarization"
}
Adrian Galdran, André Anjos, José Dolz, Hadi Chakor, Hervé Lombaert, and Ismail Ben Ayed. State-of-the-art retinal vessel segmentation with minimalistic models. Scientific Reports, 12(1):6174, April 2022. URL: https://www.nature.com/articles/s41598-022-09675-y, doi:10.1038/s41598-022-09675-y.
Article
@article{nsr-2022,
author = "Galdran, Adrian and Anjos, Andr{\'{e}} and Dolz, Jos{\'{e}} and Chakor, Hadi and Lombaert, Herv{\'{e}} and Ayed, Ismail Ben",
title = "State-of-the-art retinal vessel segmentation with minimalistic models",
volume = "12",
rights = "2022 The Author(s)",
issn = "2045-2322",
url = "https://www.nature.com/articles/s41598-022-09675-y",
pdf = "https://www.nature.com/articles/s41598-022-09675-y.pdf",
doi = "10.1038/s41598-022-09675-y",
abstract = "The segmentation of retinal vasculature from eye fundus images is a fundamental task in retinal image analysis. Over recent years, increasingly complex approaches based on sophisticated Convolutional Neural Network architectures have been pushing performance on well-established benchmark datasets. In this paper, we take a step back and analyze the real need of such complexity. We first compile and review the performance of 20 different techniques on some popular databases, and we demonstrate that a minimalistic version of a standard U-Net with several orders of magnitude less parameters, carefully trained and rigorously evaluated, closely approximates the performance of current best techniques. We then show that a cascaded extension (W-Net) reaches outstanding performance on several popular datasets, still using orders of magnitude less learnable weights than any previously published work. Furthermore, we provide the most comprehensive cross-dataset performance analysis to date, involving up to 10 different databases. Our analysis demonstrates that the retinal vessel segmentation is far from solved when considering test images that differ substantially from the training data, and that this task represents an ideal scenario for the exploration of domain adaptation techniques. In this context, we experiment with a simple self-labeling strategy that enables moderate enhancement of cross-dataset performance, indicating that there is still much room for improvement in this area. Finally, we test our approach on Artery/Vein and vessel segmentation from {OCTA} imaging problems, where we again achieve results well-aligned with the state-of-the-art, at a fraction of the model complexity available in recent literature. Code to reproduce the results in this paper is released.",
addendum = "(Issued from internship supervision)",
pages = "6174",
number = "1",
journal = "Scientific Reports",
journaltitle = "Scientific Reports",
shortjournal = "Sci Rep",
year = "2022",
month = "April",
date = "2022-04-13",
langid = "english",
keywords = "Biomedical engineering, Computer science, Machine learning"
}
2021
Matheus A. Renzo, Natália Fernandez, André A. Baceti, Natanael Nunes Moura Junior, and André Anjos. Development of a lung segmentation algorithm for analog imaged chest X-ray: preliminary results. In Anais do 15. Congresso Brasileiro de Inteligência Computacional, 1–8. SBIC, October 2021. URL: https://sbic.org.br/eventos/cbic_2021/cbic2021-123/, doi:10.21528/CBIC2021-123.
Article
@inproceedings{cbic-2021,
author = "Renzo, Matheus A. and Fernandez, Nat{\'{a}}lia and Baceti, Andr{\'{e}} A. and Moura Junior, Natanael Nunes and Anjos, Andr{\'{e}}",
title = "Development of a lung segmentation algorithm for analog imaged chest X-Ray: preliminary results",
url = "https://sbic.org.br/eventos/cbic_2021/cbic2021-123/",
pdf = "https://www.idiap.ch/~aanjos/papers/cbic-2021.pdf",
doi = "10.21528/CBIC2021-123",
shorttitle = "Development of a lung segmentation algorithm for analog imaged chest X-Ray",
addendum = "(Issued from internship supervision)",
abstract = "Analog X-Ray radiography is still used in many underdeveloped regions around the world. To allow these populations to benefit from advances in automatic computer-aided detection (CAD) systems, X-Ray films must be digitized. Unfortunately, this procedure may introduce imaging artefacts, which may severely impair the performance of such systems. This work investigates the impact digitized images may cause to deep neural networks trained for lung (semantic) segmentation on digital x-ray samples. While three public datasets for lung segmentation evaluation exist for digital samples, none are available for digitized data. To this end, a U-Net-style architecture was trained on publicly available data, and used to predict lung segmentation on a newly annotated set of digitized images. Using typical performance metrics such as the area under the precision-recall curve (AUPRC), our results show that the model is capable of identifying lung regions in digital X-Rays with high intra-dataset (AUPRC: 0.99) and cross-dataset (AUPRC: 0.99) efficiency on unseen test data. When challenged against digitized data, the performance is substantially degraded (AUPRC: 0.90). Our analysis also suggests that typical performance markers, maximum F1 score and AUPRC, seem not to be informative to characterize segmentation problems in test images. For this goal, pixels do not have independence due to the natural connectivity of lungs in images; this implies that a lung pixel tends to be surrounded by other lung pixels. This work is reproducible. Source code, evaluation protocols and baseline results are available at: https://pypi.org/project/bob.ip.binseg/.",
eventtitle = "Congresso Brasileiro de Inteligência Computacional",
pages = "1--8",
booktitle = "Anais do 15. Congresso Brasileiro de Inteligência Computacional",
year = "2021",
month = "October",
publisher = "{SBIC}"
}
2020
Adrian Galdran, André Anjos, José Dolz, Hadi Chakor, Hervé Lombaert, and Ismail Ben Ayed. The little W-Net that could: state-of-the-art retinal vessel segmentation with minimalistic models. September 2020. URL: https://arxiv.org/abs/2009.01907, arXiv:2009.01907, doi:10.48550/arXiv.2009.01907.
Article
@misc{arxiv-2020,
author = "Galdran, Adrian and Anjos, Andr{\'{e}} and Dolz, Jos{\'{e}} and Chakor, Hadi and Lombaert, Herv{\'{e}} and Ayed, Ismail Ben",
title = "The Little W-Net That Could: State-of-the-Art Retinal Vessel Segmentation with Minimalistic Models",
year = "2020",
month = "September",
doi = "10.48550/arXiv.2009.01907",
eprinttype = "arxiv",
eprint = "2009.01907",
archivePrefix = "arXiv",
primaryClass = "cs.CV",
journaltitle = "{arXiv}:2009.01907 [cs, eess] (submitted to Nature Scientific Reports)",
url = "https://arxiv.org/abs/2009.01907",
pdf = "https://arxiv.org/pdf/2009.01907",
abstract = "The segmentation of the retinal vasculature from eye fundus images represents one of the most fundamental tasks in retinal image analysis. Over recent years, increasingly complex approaches based on sophisticated Convolutional Neural Network architectures have been slowly pushing performance on well-established benchmark datasets. In this paper, we take a step back and analyze the real need of such complexity. Specifically, we demonstrate that a minimalistic version of a standard UNet with several orders of magnitude less parameters, carefully trained and rigorously evaluated, closely approximates the performance of current best techniques. In addition, we propose a simple extension, dubbed W-Net, which reaches outstanding performance on several popular datasets, still using orders of magnitude less learnable weights than any previously published approach. Furthermore, we provide the most comprehensive cross-dataset performance analysis to date, involving up to 10 different databases. Our analysis demonstrates that the retinal vessel segmentation problem is far from solved when considering test images that differ substantially from the training data, and that this task represents an ideal scenario for the exploration of domain adaptation techniques. In this context, we experiment with a simple self-labeling strategy that allows us to moderately enhance cross-dataset performance, indicating that there is still much room for improvement in this area. Finally, we also test our approach on the Artery/Vein segmentation problem, where we again achieve results well-aligned with the state-of-the-art, at a fraction of the model complexity in recent literature. All the code to reproduce the results in this paper is released."
}
Ana Cláudia Barbosa Honório Ferreira, Danton Diego Ferreira, Henrique Ceretta Oliveira, Igor Carvalho de Resende, André Anjos, and Maria Helena Baena de Moraes Lopes. Competitive neural layer-based method to identify people with high risk for diabetic foot. Computers in Biology and Medicine, May 2020. URL: https://www.sciencedirect.com/science/article/pii/S0010482520301244, doi:10.1016/j.compbiomed.2020.103744.
Article
@article{compbiomed-2020,
author = "Ferreira, Ana Cl{\'{a}}udia Barbosa Hon{\'{o}}rio and Ferreira, Danton Diego and Oliveira, Henrique Ceretta and Resende, Igor Carvalho de and Anjos, Andr{\'{e}} and Lopes, Maria Helena Baena de Moraes",
title = "Competitive neural layer-based method to identify people with high risk for diabetic foot",
volume = "120",
url = "https://www.sciencedirect.com/science/article/pii/S0010482520301244",
pdf = "https://www.idiap.ch/~aanjos/papers/compbiomed-2020.pdf",
doi = "10.1016/j.compbiomed.2020.103744",
abstract = "Background and objective: To automatically identify patients with diabetes mellitus (DM) who have high risk of developing diabetic foot, via an unsupervised machine learning technique. Methods: We collected a new database containing 54 known risk factors from 250 patients diagnosed with diabetes mellitus. The database also contained a separate validation cohort composed of 73 subjects, where the perceived risk was annotated by expert nurses. A competitive neuron layer-based method was used to automatically split training data into two risk groups. Results: We found that one of the groups was composed of patients with higher risk of developing diabetic foot. The dominant variables that described group membership via our method agreed with the findings from other studies, and indicated a greater risk for developing such a condition. Our method was validated on the available test data, reaching 71\% sensitivity, 100\% specificity, and 90\% accuracy. Conclusions: Unsupervised learning may be deployed to screen patients with diabetes mellitus, pointing out high-risk individuals who require priority follow-up in the prevention of diabetic foot with very high accuracy. The proposed method is automatic and does not require clinical examinations to perform risk assessment, being solely based on the information of a questionnaire answered by patients. Our study found that discriminant variables for predicting risk group membership are highly correlated with expert opinion.",
journal = "Computers in Biology and Medicine",
month = "May",
year = "2020",
keywords = "Artificial neural network, Diabetes mellitus, Diabetic foot"
}
2019
Tiago de Freitas Pereira, André Anjos, and Sébastien Marcel. Heterogeneous face recognition using domain specific units. IEEE Transactions on Information Forensics and Security, December 2019. URL: https://publications.idiap.ch/index.php/publications/show/3963, doi:10.1109/TIFS.2018.2885284.
Article
@article{tifs-2019,
author = "de Freitas Pereira, Tiago and Anjos, André and Marcel, Sébastien",
month = "December",
title = "Heterogeneous Face Recognition Using Domain Specific Units",
journal = "IEEE Transactions on Information Forensics and Security",
year = "2019",
addendum = "(Issued from Ph.D co-supervision)",
doi = "10.1109/TIFS.2018.2885284",
url = "https://publications.idiap.ch/index.php/publications/show/3963",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tifs-2018.pdf",
abstract = "The task of Heterogeneous Face Recognition consists in matching face images that are sensed in different domains, such as sketches to photographs (visual spectra images), thermal images to photographs or near-infrared images to photographs. In this work we suggest that high level features of Deep Convolutional Neural Networks trained on visual spectra images are potentially domain independent and can be used to encode faces sensed in different image domains. A generic framework for Heterogeneous Face Recognition is proposed by adapting Deep Convolutional Neural Networks low level features in so-called “Domain Specific Units”. The adaptation using Domain Specific Units allows the learning of shallow feature detectors specific for each new image domain. Furthermore, it handles its transformation to a generic face space shared between all image domains. Experiments carried out with four different face databases covering three different image domains show substantial improvements, in terms of recognition rate, surpassing the state-of-the-art for most of them. This work is made reproducible: all the source code, scores and trained models of this approach are made publicly available."
}
Tim Laibacher and André Anjos. On the evaluation and real-world usage scenarios of deep vessel segmentation for retinography. September 2019. URL: https://arxiv.org/abs/1909.03856, arXiv:1909.03856, doi:10.48550/arXiv.1909.03856.
Article
@misc{arxiv-2019,
author = "Laibacher, Tim and Anjos, Andr\'e",
title = "On the Evaluation and Real-World Usage Scenarios of Deep Vessel Segmentation for Retinography",
addendum = "(Issued from internship supervision)",
year = "2019",
month = "September",
eprint = "1909.03856",
archivePrefix = "arXiv",
primaryClass = "cs.CV",
doi = "10.48550/arXiv.1909.03856",
url = "https://arxiv.org/abs/1909.03856",
pdf = "https://arxiv.org/pdf/1909.03856",
journaltitle = "{arXiv}:1909.03856 [cs] (submitted to IEEE International Symposium on Biomedical Imaging 2021)",
abstract = "We identify and address three research gaps in the field of vessel segmentation for retinography. The first focuses on the task of inference on high-resolution fundus images for which only a limited set of ground-truth data is publicly available. Notably, we highlight that simple rescaling and padding or cropping of lower resolution datasets is surprisingly effective. We further explore the effectiveness of semi-supervised learning for better domain adaptation in this context. Our results show competitive performance on a set of common public retina datasets, using a small and light-weight neural network. For HRF, the only very high-resolution dataset currently available, we reach comparable, if not superior, state-of-the-art performance by solely relying on training images from lower-resolution datasets. The second topic we address concerns the lack of standardisation in evaluation metrics. We investigate the variability of the F1-score on the existing datasets and report results for recently published architectures. Our evaluation shows that most reported results are actually comparable to each other in performance. Finally, we address the issue of reproducibility, by open-sourcing the complete framework used to produce results shown here."
}
Lambert Sonna Momo, Luciano Cerqueira Torres, Sébastien Marcel, André Anjos, Michael Liebling, Adrian Shajkofci, Serge Amoos, Alain Woeffray, Alexandre Sierro, Pierre Roduit, Pierre Ferrez, and Lucas Bonvin. Method and device for biometric vascular recognition and/or identification. Patent WO/2019/150254, August 2019. URL: https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2019150254
@patent{3dfv-patent-2019,
author = "Sonna Momo, Lambert and Cerqueira Torres, Luciano and Marcel, S\'ebastien and Anjos, Andr\'e and Liebling, Michael and Shajkofci, Adrian and Amoos, Serge and Woeffray, Alain and Sierro, Alexandre and Roduit, Pierre and Ferrez, Pierre and Bonvin, Lucas",
title = "Method and Device for Biometric Vascular Recognition and/or Identification",
year = "2019",
month = "August",
day = "8",
number = "WO/2019/150254",
type = "Patent",
filing_num = "PCT/IB2019/050708",
yearfiled = "2019",
monthfiled = "1",
dayfiled = "29",
pat_refs = "P\\&TS SA (AG, LTD.); Av. J.-J. Rousseau 4 P.O. Box 2848 2001 Neuchâtel, CH",
abstract = "The invention concerns a method and a biometric acquisition device for biometric vascular recognition and/or identification. The method comprising a step of capturing a plurality of veins images (116, 117, 118) of supposed subcutaneous veins (21) of a same inspecting portion (20) of a presented entity (2) from various converging orientations (113, 114, 115). The method further comprises a step of determine if said entity is a spoof based on estimated likelihood that said supposed subcutaneous veins within said plurality of veins images (116, 117, 118) are likely projections of solid veins (120).",
url = "https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2019150254"
}
Anjith George, Zohreh Mostaani, David Geissenbuhler, Olegs Nikisins, André Anjos, and Sébastien Marcel. Biometric face presentation attack detection with multi-channel convolutional neural network. IEEE Transactions on Information Forensics and Security, May 2019. doi:10.1109/TIFS.2019.2916652.
Article
@article{tifs-2019-2,
author = "George, Anjith and Mostaani, Zohreh and Geissenbuhler, David and Nikisins, Olegs and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
title = "Biometric Face Presentation Attack Detection with Multi-Channel Convolutional Neural Network",
journal = "IEEE Transactions on Information Forensics and Security",
month = "May",
year = "2019",
doi = "10.1109/TIFS.2019.2916652",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/tifs-2019-2.pdf",
abstract = "Face recognition is a mainstream biometric authentication method. However, vulnerability to presentation attacks (a.k.a. spoofing) limits its usability in unsupervised applications. Even though there are many methods available for tackling presentation attacks (PA), most of them fail to detect sophisticated attacks such as silicone masks. As the quality of presentation attack instruments improves over time, achieving reliable PA detection with visual spectra alone remains very challenging. We argue that analysis in multiple channels might help to address this issue. In this context, we propose a multi-channel Convolutional Neural Network based approach for presentation attack detection (PAD). We also introduce the new Wide Multi-Channel presentation Attack (WMCA) database for face PAD which contains a wide variety of 2D and 3D presentation attacks for both impersonation and obfuscation attacks. Data from different channels such as color, depth, near-infrared and thermal are available to advance the research in face PAD. The proposed method was compared with feature-based approaches and found to outperform the baselines achieving an ACER of 0.3\\% on the introduced dataset. The database and the software to reproduce the results are made available publicly."
}
André Anjos, Pedro Tome, and Sébastien Marcel. An introduction to vein presentation attacks and detection. In Sébastien Marcel, Mark Nixon, Julian Fierrez, and Nicholas Evans, editors, Handbook of Biometric Anti-Spoofing, pages 419–438. Springer-Verlag, 2nd edition (in press), January 2019. doi:10.1007/978-3-319-92627-8_18.
@incollection{hopad-2019-3,
author = "Anjos, Andr{\'{e}} and Tome, Pedro and Marcel, S{\'{e}}bastien",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Fierrez, Julian and Evans, Nicholas",
title = "An Introduction to Vein Presentation Attacks and Detection",
edition = "2nd (in press)",
booktitle = "Handbook of Biometric Anti-Spoofing",
publisher = "Springer-Verlag",
year = "2019",
month = "January",
pages = "419--438",
isbn = "978-3-319-92627-8",
doi = "10.1007/978-3-319-92627-8\_18",
abstract = "The domain of presentation attacks (PA), including vulnerability studies and detection (PAD), remains very much unexplored by available scientific literature in biometric vein recognition. Contrary to other modalities that use visual spectral sensors for capturing biometric samples, vein biometrics is typically implemented with near-infrared imaging. The use of invisible light spectra challenges the creation of PA instruments, but does not render it impossible. In this chapter, we provide an overview of the current landscape of PA manufacturing and possible attack vectors for vein recognition, describe existing public databases and baseline techniques to counter such attacks. The reader will also find material to reproduce experiments and findings for fingervein recognition systems. We provide this material with the hope it will be extended to other vein recognition systems and improved in time."
}
Ivana Chingovska, Amir Mohammadi, André Anjos, and Sébastien Marcel. Evaluation methodologies for biometric presentation attack detection. In Sébastien Marcel, Mark Nixon, Julian Fierrez, and Nicholas Evans, editors, Handbook of Biometric Anti-Spoofing, pages 457–480. Springer-Verlag, 2nd edition (in press), January 2019. doi:10.1007/978-3-319-92627-8_20.
@incollection{hopad-2019-2,
author = "Chingovska, Ivana and Mohammadi, Amir and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Fierrez, Julian and Evans, Nicholas",
title = "Evaluation Methodologies for Biometric Presentation Attack Detection",
addendum = "(Issued from Ph.D co-supervision)",
edition = "2nd (in press)",
booktitle = "Handbook of Biometric Anti-Spoofing",
publisher = "Springer-Verlag",
year = "2019",
month = "January",
pages = "457--480",
isbn = "978-3-319-92627-8",
doi = "10.1007/978-3-319-92627-8\_20",
abstract = "Presentation attack detection (PAD, also known as anti-spoofing) systems, regardless of the technique, biometric mode or degree of independence of external equipment, are most commonly treated as binary classification systems. The two classes that they differentiate are bona-fide and presentation attack samples. From this perspective, their evaluation is equivalent to the established evaluation standards for the binary classification systems. However, PAD systems are designed to operate in conjunction with recognition systems and as such can affect their performance. From the point of view of a recognition system, the presentation attacks are a separate class that needs to be detected and rejected. As the problem of presentation attack detection grows to this pseudo-ternary status, the evaluation methodologies for the recognition systems need to be revised and updated. Consequently, the database requirements for presentation attack databases become more specific. The focus of this chapter is the task of biometric verification and its scope is three-fold: firstly, it gives the definition of the presentation attack detection problem from the two perspectives. Secondly, it states the database requirements for a fair and unbiased evaluation. Finally, it gives an overview of the existing evaluation techniques for presentation attack detection systems and verification systems under presentation attacks."
}
Sushil Bhattacharjee, Amir Mohammadi, André Anjos, and Sébastien Marcel. Recent advances in face presentation attack detection. In Sébastien Marcel, Mark Nixon, Julian Fierrez, and Nicholas Evans, editors, Handbook of Biometric Anti-Spoofing, pages 207–228. Springer-Verlag, 2nd edition (in press), January 2019. doi:10.1007/978-3-319-92627-8_10.
@incollection{hopad-2019,
author = "Bhattacharjee, Sushil and Mohammadi, Amir and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Fierrez, Julian and Evans, Nicholas",
title = "Recent Advances in Face Presentation Attack Detection",
edition = "2nd (in press)",
booktitle = "Handbook of Biometric Anti-Spoofing",
publisher = "Springer-Verlag",
year = "2019",
month = "January",
pages = "207--228",
isbn = "978-3-319-92627-8",
doi = "10.1007/978-3-319-92627-8\_10",
abstract = "The undeniable convenience of face-recognition (FR) based biometrics has made it an attractive tool for access control in various applications, from immigration-control to remote banking. Widespread adoption of face biometrics, however, depends on how secure such systems are perceived to be. One particular vulnerability of FR systems comes from presentation attacks (PA), where a subject A attempts to impersonate another subject B, by presenting, for example, a photograph of B to the biometric sensor (i.e., the camera). PAs are the most likely forms of attacks on face biometric systems, as the camera is the only component of the biometric system that is exposed to the outside world. Robust presentation attack detection (PAD) methods are necessary to construct secure FR based access control systems. The first edition of the Handbook of Biometric Anti-spoofing included two chapters on face-PAD. In this chapter we present the significant advances in face-PAD research since the publication of the first edition of this book. In addition to PAD methods designed to work with color images, we also discuss advances in face-PAD methods using other imaging modalities, namely, near-infrared (NIR) and thermal imaging. This chapter also presents a number of recently published datasets for face-PAD experiments."
}
2018
Olegs Nikisins, Teodors Eglitis, André Anjos, and Sébastien Marcel. Fast cross-correlation based wrist vein recognition algorithm with rotation and translation compensation. In Sixth International Workshop on Biometrics and Forensics. June 2018. URL: https://publications.idiap.ch/index.php/publications/show/3835, doi:10.1109/IWBF.2018.8401550.
Article
@inproceedings{iwbf-2018,
author = "Nikisins, Olegs and Eglitis, Teodors and Anjos, André and Marcel, Sébastien",
month = "June",
title = "Fast cross-correlation based wrist vein recognition algorithm with rotation and translation compensation",
booktitle = "Sixth International Workshop on Biometrics and Forensics",
year = "2018",
url = "https://publications.idiap.ch/index.php/publications/show/3835",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/iwbf-2018.pdf",
doi = "10.1109/IWBF.2018.8401550",
abstract = "Most of the research on vein biometrics addresses the problems of either palm or finger vein recognition with a considerably smaller emphasis on wrist vein modality. This paper paves the way to a better understanding of capabilities and challenges in the field of wrist vein verification. This is achieved by introducing and discussing a fully automatic cross-correlation based wrist vein verification technique. Overcoming the limitations of ordinary cross-correlation, the proposed system is capable of compensating for scale, translation and rotation between vein patterns in a computationally efficient way. The introduced comparison algorithm requires only two cross-correlation operations to compensate for both translation and rotation; moreover, the well-known property of log-polar transformation of Fourier magnitudes is not involved in any form. To emphasize the veins, a two-layer Hessian-based vein enhancement approach with adaptive brightness normalization is introduced, improving the connectivity and the stability of extracted vein patterns. The experiments on the publicly available PUT Vein wrist database give promising results with FNMR of 3.75\\% for FMR of 0.1\\%. In addition we make this research reproducible providing the source code and instructions to replicate all findings in this work."
}
Sébastien Marcel, André Anjos, and Philip Abbet. Method and internet-connected server for reviewing a computer-executable experiment. Patent US9973503B2, May 2018. URL: https://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/9973503
@patent{beat-patent-2018,
author = "Marcel, Sébastien and Anjos, André and Abbet, Philip",
title = "Method and internet-connected server for reviewing a computer-executable experiment",
year = "2018",
month = "May",
day = "15",
number = "US9973503B2",
type = "Patent",
location = "US",
filing_num = "14/970,333",
yearfiled = "2015",
monthfiled = "12",
dayfiled = "15",
pat_refs = "P\\&TS SA (AG, LTD.); Av. J.-J. Rousseau 4 P.O. Box 2848 2001 Neuchâtel, CH",
abstract = "An internet-connected server comprising a first module for authorizing a user to access the server for: setting up, on the server, a given configuration for conducting a computer-executable experiment, wherein the given configuration comprises at least an executable instruction and a parameter or input data; executing, on the server, the computer-executable experiment with the given configuration so to produce a numerical result; certifying, on the server, the numerical result so to produce a certified result; and generating, on the server, a certification identifier of the certified result. The internet-connected server further comprises a second module for authorizing a reviewer for: providing the server with the certification identifier; and requesting and/or accessing, on the server, the certified numerical result on the basis of the provided certification identifier.",
url = "https://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1\&Sect2=HITOFF\&p=1\&u=/netahtml/PTO/search-bool.html\&r=1\&f=G\&l=50\&d=PALL\&RefSrch=yes\&Query=PN/9973503"
}
Olegs Nikisins, Amir Mohammadi, André Anjos, and Sébastien Marcel. On effectiveness of anomaly detection approaches against unseen presentation attacks in face anti-spoofing. In The 11th IAPR International Conference on Biometrics (ICB 2018). February 2018. URL: https://publications.idiap.ch/index.php/publications/show/3793, doi:10.1109/ICB2018.2018.00022.
Article
@inproceedings{icb-2018,
author = "Nikisins, Olegs and Mohammadi, Amir and Anjos, André and Marcel, Sébastien",
month = "February",
title = "On Effectiveness of Anomaly Detection Approaches against Unseen Presentation Attacks in Face Anti-Spoofing",
booktitle = "The 11th IAPR International Conference on Biometrics (ICB 2018)",
year = "2018",
url = "https://publications.idiap.ch/index.php/publications/show/3793",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icb-2018.pdf",
doi = "10.1109/ICB2018.2018.00022",
abstract = "While face recognition systems got a significant boost in terms of recognition performance in recent years, they are known to be vulnerable to presentation attacks. Up to date, most of the research in the field of face anti-spoofing or presentation attack detection was considered as a two-class classification task: features of bona-fide samples versus features coming from spoofing attempts. The main focus has been on boosting the anti-spoofing performance for databases with identical types of attacks across both training and evaluation subsets. However, in realistic applications the types of attacks are likely to be unknown, potentially occupying a broad space in the feature domain. Therefore, a failure to generalize on unseen types of attacks is one of the main potential challenges in existing anti-spoofing approaches. First, to demonstrate the generalization issues of two-class anti-spoofing systems we establish new evaluation protocols for existing publicly available databases. Second, to unite the data collection efforts of various institutions we introduce a challenging Aggregated database composed of 3 publicly available datasets: Replay-Attack, Replay-Mobile and MSU MFSD, reporting the performance on it. Third, considering existing limitations we propose a number of systems approaching a task of presentation attack detection as an anomaly detection, or a one-class classification problem, using only bona-fide features in the training stage. Using less training data, hence requiring less effort in the data collection, the introduced approach demonstrates better generalization properties against previously unseen types of attacks on the proposed Aggregated database."
}
2017
André Anjos and Sébastien Marcel. A data-network connected server, a device, a platform and a method for conducting computer-executable experiments. Patent WO/2017/221049, December 2017. URL: https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2017221049
@patent{beat-patent-2017,
author = "Anjos, André and Marcel, Sébastien",
title = "A data-network connected server, a device, a platform and a method for conducting computer-executable experiments",
year = "2017",
month = "December",
day = "28",
number = "WO/2017/221049",
type = "Patent",
location = "CH",
filing_num = "PCT/IB2016/053683",
yearfiled = "2016",
monthfiled = "6",
dayfiled = "21",
pat_refs = "P\\&TS SA (AG, LTD.); Av. J.-J. Rousseau 4 P.O. Box 2848 2001 Neuchâtel, CH",
abstract = "The invention concerns a platform (1), a server (10, 10') and a client device (20) for conducting computer-executable experiments. The server comprises a restricted-access memory module (11,11') for locally storing a data structure with numerical values whose access is restricted to authorized devices and/or users. The server is provided with an instruction receiving module (12,12') for receiving a list of executable instructions for conducting a computer-executable experiment based on numerical values with restricted access from the client device being not authorized to accessing numerical values with restricted access. The server comprises an execution module (13,13') for conducting the experiment so to produce a numerical result; and a communication module (12,12') for transmitting the result to the client device and/or to the user of the client device.",
url = "https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2017221049"
}
Guillaume Heusch, André Anjos, and Sébastien Marcel. A reproducible study on remote heart rate measurement. September 2017. URL: https://arxiv.org/abs/1709.00962, arXiv:1709.00962, doi:10.48550/arXiv.1709.00962.
@misc{arxiv-2017-2,
author = "Heusch, Guillaume and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
title = "A reproducible study on remote heart rate measurement",
journal = "arXiv",
year = "2017",
month = "September",
archivePrefix = "arXiv",
eprint = "1709.00962",
primaryClass = "cs.SE",
addendum = "(Issued from project co-supervision)",
url = "https://arxiv.org/abs/1709.00962",
doi = "10.48550/arXiv.1709.00962",
abstract = "This paper studies the problem of reproducible research in remote photoplethysmography (rPPG). Most of the work published in this domain is assessed on privately-owned databases, making it difficult to evaluate proposed algorithms in a standard and principled manner. As a consequence, we present a new, publicly available database containing a relatively large number of subjects recorded under two different lighting conditions. Also, three state-of-the-art rPPG algorithms from the literature were selected, implemented and released as open source free software. After a thorough, unbiased experimental evaluation in various settings, it is shown that none of the selected algorithms is precise enough to be used in a real-world scenario."
}
Milos Cernak, Alain Komaty, Amir Mohammadi, André Anjos, and Sébastien Marcel. Bob speaks kaldi. In Proceedings of Interspeech. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3623.
Article
@inproceedings{interspeech-2017,
author = "Cernak, Milos and Komaty, Alain and Mohammadi, Amir and Anjos, André and Marcel, Sébastien",
month = "August",
title = "Bob Speaks Kaldi",
booktitle = "Proceedings of Interspeech",
year = "2017",
url = "https://publications.idiap.ch/index.php/publications/show/3623",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/interspeech-2017.pdf",
abstract = "This paper introduces and demonstrates Kaldi integration into Bob signal-processing and machine learning toolbox. The motivation for this integration is two-fold. Firstly, Bob benefits from using advanced speech processing tools developed in Kaldi. Secondly, Kaldi benefits from using complementary Bob modules, such as modulation-based VAD with an adaptive thresholding. In addition, Bob is designed as an open science tool, and this integration might offer to the Kaldi speech community a framework for better reproducibility of state-of-the-art research results."
}
André Anjos, Laurent El Shafey, and Sébastien Marcel. Beat: an open-science web platform. In Thirty-fourth International Conference on Machine Learning. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3665.
Article | Poster
@inproceedings{icml-2017-1,
author = "Anjos, André and El Shafey, Laurent and Marcel, Sébastien",
month = "August",
title = "BEAT: An Open-Science Web Platform",
booktitle = "Thirty-fourth International Conference on Machine Learning",
year = "2017",
location = "Sydney, Australia",
url = "https://publications.idiap.ch/index.php/publications/show/3665",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icml-2017-1.pdf",
poster = "https://www.idiap.ch/\textasciitilde aanjos/posters/icml-2017-1.pdf",
abstract = "With the increased interest in computational sciences, machine learning (ML), pattern recognition (PR) and big data, governmental agencies, academia and manufacturers are overwhelmed by the constant influx of new algorithms and techniques promising improved performance, generalization and robustness. Sadly, result reproducibility is often an overlooked feature accompanying original research publications, competitions and benchmark evaluations. The main reasons behind such a gap arise from natural complications in research and development in this area: the distribution of data may be a sensitive issue; software frameworks are difficult to install and maintain; test protocols may involve a potentially large set of intricate steps which are difficult to handle. To bridge this gap, we built an open platform for research in computational sciences related to pattern recognition and machine learning, to help on the development, reproducibility and certification of results obtained in the field. By making use of such a system, academic, governmental or industrial organizations enable users to easily and socially develop processing toolchains, re-use data, algorithms, workflows and compare results from distinct algorithms and/or parameterizations with minimal effort. This article presents such a platform and discusses some of its key features, uses and limitations. We overview a currently operational prototype and provide design insights."
}
André Anjos, Manuel Günther, Tiago de Freitas Pereira, Pavel Korshunov, Amir Mohammadi, and Sébastien Marcel. Continuously reproducing toolchains in pattern recognition and machine learning experiments. In Thirty-fourth International Conference on Machine Learning. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3666.
Article | Poster
@inproceedings{icml-2017-2,
author = "Anjos, André and Günther, Manuel and de Freitas Pereira, Tiago and Korshunov, Pavel and Mohammadi, Amir and Marcel, Sébastien",
month = "August",
title = "Continuously Reproducing Toolchains in Pattern Recognition and Machine Learning Experiments",
booktitle = "Thirty-fourth International Conference on Machine Learning",
year = "2017",
location = "Sydney, Australia",
url = "https://publications.idiap.ch/index.php/publications/show/3666",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icml-2017-2.pdf",
poster = "https://www.idiap.ch/\textasciitilde aanjos/posters/icml-2017-2.pdf",
abstract = "Pattern recognition and machine learning research work often contains experimental results on real-world data, which corroborates hypotheses and provides a canvas for the development and comparison of new ideas. Results, in this context, are typically summarized as a set of tables and figures, allowing the comparison of various methods, highlighting the advantages of the proposed ideas. Unfortunately, result reproducibility is often an overlooked feature of original research publications, competitions, or benchmark evaluations. The main reason for such a gap is the complexity on the development of software associated with these reports. Software frameworks are difficult to install, maintain, and distribute, while scientific experiments often consist of many steps and parameters that are difficult to report. The increasingly rising complexity of research challenges make it even more difficult to reproduce experiments and results. In this paper, we emphasize that a reproducible research work should be repeatable, shareable, extensible, and stable, and discuss important lessons we learned in creating, distributing, and maintaining software and data for reproducible research in pattern recognition and machine learning. We focus on a specific use-case of face recognition and describe in details how we can make the recognition experiments reproducible in practice."
}
André Anjos, Laurent El-Shafey, and Sébastien Marcel. Beat: an open-source web-based open-science platform. April 2017. URL: https://arxiv.org/abs/1704.02319, arXiv:1704.02319, doi:10.48550/arXiv.1704.02319.
@misc{arxiv-2017,
author = "Anjos, André and El-Shafey, Laurent and Marcel, Sébastien",
title = "BEAT: An Open-Source Web-Based Open-Science Platform",
year = "2017",
month = "April",
archivePrefix = "arXiv",
eprint = "1704.02319",
primaryClass = "cs.SE",
doi = "10.48550/arXiv.1704.02319",
url = "https://arxiv.org/abs/1704.02319",
abstract = "With the increased interest in computational sciences, machine learning (ML), pattern recognition (PR) and big data, governmental agencies, academia and manufacturers are overwhelmed by the constant influx of new algorithms and techniques promising improved performance, generalization and robustness. Sadly, result reproducibility is often an overlooked feature accompanying original research publications, competitions and benchmark evaluations. The main reasons behind such a gap arise from natural complications in research and development in this area: the distribution of data may be a sensitive issue; software frameworks are difficult to install and maintain; test protocols may involve a potentially large set of intricate steps which are difficult to handle. Given the rising complexity of research challenges and the constant increase in data volume, the conditions for achieving reproducible research in the domain are also increasingly difficult to meet. To bridge this gap, we built an open platform for research in computational sciences related to pattern recognition and machine learning, to help on the development, reproducibility and certification of results obtained in the field. By making use of such a system, academic, governmental or industrial organizations enable users to easily and socially develop processing toolchains, re-use data, algorithms, workflows and compare results from distinct algorithms and/or parameterizations with minimal effort. This article presents such a platform and discusses some of its key features, uses and limitations. We overview a currently operational prototype and provide design insights."
}
2016
Aythami Morales, Julian Fierrez, Ruben Tolosana, Javier Ortega-Garcia, Javier Galbally, Marta Gomez-Barrero, André Anjos, and Sébastien Marcel. Keystroke biometrics ongoing competition. IEEE Access, 4:7736–7746, November 2016. doi:10.1109/ACCESS.2016.2626718.
Article
@article{ieee-access-2016,
author = "Morales, Aythami and Fierrez, Julian and Tolosana, Ruben and Ortega-Garcia, Javier and Galbally, Javier and Gomez-Barrero, Marta and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
month = "November",
title = "Keystroke Biometrics Ongoing Competition",
journal = "IEEE Access",
volume = "4",
year = "2016",
pages = "7736--7746",
issn = "2169-3536",
doi = "10.1109/ACCESS.2016.2626718",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-access-2016.pdf",
abstract = "This paper presents the first Keystroke Biometrics Ongoing Competition (KBOC), organized to establish a reproducible baseline in person authentication using keystroke biometrics. The competition has been developed using the BEAT platform and includes one of the largest keystroke databases publicly available, based on a fixed-text scenario. The database includes genuine and attacker keystroke sequences from 300 users acquired in 4 different sessions distributed over a four-month time span. The sequences correspond to the user's name and surname, and therefore each user comprises an individual and personal sequence. As a baseline for KBOC, we report the results of 31 different algorithms evaluated according to performance and robustness. The systems have achieved EERs as low as 5.32\\% and high robustness against multisession variability, with performance drops lower than 1\\% for probes separated by months. The entire database is publicly available at the competition website."
}
Ivana Chingovska, Nesli Erdogmus, André Anjos, and Sébastien Marcel. Face recognition systems under spoofing attacks. In Face Recognition Across the Imaging Spectrum, chapter 8, pages 165–194. Springer International Publishing, 1st edition, February 2016. doi:10.1007/978-3-319-28501-6_8.
@incollection{face-spoof-2016,
author = "Chingovska, Ivana and Erdogmus, Nesli and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
month = "February",
title = "Face Recognition Systems Under Spoofing Attacks",
booktitle = "Face Recognition Across the Imaging Spectrum",
edition = "1st",
chapter = "8",
year = "2016",
pages = "165--194",
publisher = "Springer International Publishing",
doi = "10.1007/978-3-319-28501-6\_8",
addendum = "(Issued from Ph.D co-supervision)",
abstract = "In this chapter, we give an overview of spoofing attacks and spoofing countermeasures for face recognition systems, with a focus on visual spectrum systems (VIS) in 2D and 3D, as well as near-infrared (NIR) and multispectral systems. We cover the existing types of spoofing attacks and report on their success to bypass several state-of-the-art face recognition systems. The results on two different face spoofing databases in VIS and one newly developed face spoofing database in NIR show that spoofing attacks present a significant security risk for face recognition systems in any part of the spectrum. The risk is partially reduced when using multispectral systems. We also give a systematic overview of the existing anti-spoofing techniques, with an analysis of their advantages and limitations and prospects for future work."
}
2015
Ivana Chingovska and André Anjos. On the use of client identity information for face anti-spoofing. IEEE Transactions on Information Forensics and Security, Special Issue on Biometric Anti-spoofing, 10(4):787–796, February 2015. doi:10.1109/TIFS.2015.2400392.
Article
@article{tifs-2015,
author = "Chingovska, Ivana and Anjos, Andr{\'{e}}",
keywords = "Biometric Verification, Counter-Measures, Counter-Spoofing, Liveness Detection, Replay, Spoofing Attacks",
title = "On the use of client identity information for face anti-spoofing",
journal = "IEEE Transactions on Information Forensics and Security, Special Issue on Biometric Anti-spoofing",
addendum = "(Issued from Ph.D co-supervision)",
volume = "10",
number = "4",
month = "February",
year = "2015",
pages = "787--796",
doi = "10.1109/TIFS.2015.2400392",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/tifs-2015.pdf",
abstract = "With biometrics playing the role of a password which cannot be replaced if stolen, the necessity of establishing counter-measures to biometric spoofing attacks has been recognized. Regardless of the biometric mode, the typical approach of anti-spoofing systems is to classify biometric evidence based on features discriminating between real accesses and spoofing attacks. For the first time, to the best of our knowledge, this paper studies the amount of client-specific information within these features and how it affects the performance of anti-spoofing systems. We make use of this information to build two client-specific anti-spoofing solutions, one relying on a generative and another one on a discriminative paradigm. The proposed methods, tested on a set of state-of-the-art anti-spoofing features for the face mode, outperform the client-independent approaches with up to 50\\% relative improvement and exhibit better generalization capabilities on unseen types of spoofing attacks."
}
2014
Ivana Chingovska, André Anjos, and Sébastien Marcel. Biometrics evaluation under spoofing attacks. IEEE Transactions on Information Forensics and Security, 9(12), August 2014. doi:10.1109/TIFS.2014.2349158.
Article
@article{tifs-2014,
author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
title = "Biometrics Evaluation Under Spoofing Attacks",
journal = "IEEE Transactions on Information Forensics and Security",
year = "2014",
month = "August",
volume = "9",
number = "12",
doi = "10.1109/TIFS.2014.2349158",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/tifs-2014.pdf",
addendum = "(Issued from Ph.D co-supervision)",
abstract = "While more accurate and reliable than ever, the trustworthiness of biometric verification systems is compromised by the emergence of spoofing attacks. Responding to this threat, numerous research publications address isolated spoofing detection, resulting in efficient counter-measures for many biometric modes. However, an important but often overlooked issue regards their engagement in a verification task and how to measure their impact on the verification systems themselves. A novel evaluation framework for verification systems under spoofing attacks, called the Expected Performance and Spoofability (EPS) framework, is the major contribution of this paper. Its purpose is to serve for an objective comparison of different verification systems with regard to their verification performance and vulnerability to spoofing, taking into account the system's application-dependent susceptibility to spoofing attacks and the cost of the errors. The convenience of the proposed open-source framework is demonstrated for the face mode, by comparing the security guarantee of four baseline face verification systems before and after they are secured with anti-spoofing algorithms."
}
Tiago de Freitas Pereira, Jukka Komulainen, André Anjos, José Mario De Martino, Abdenour Hadid, Matti Pietikainen, and Sébastien Marcel. Face liveness detection using dynamic texture. EURASIP Journal on Image and Video Processing, January 2014. doi:10.1186/1687-5281-2014-2.
Article
@article{eurasip-2014,
author = "de Freitas Pereira, Tiago and Komulainen, Jukka and Anjos, André and De Martino, José Mario and Hadid, Abdenour and Pietikainen, Matti and Marcel, Sébastien",
title = "Face liveness detection using dynamic texture",
journal = "EURASIP Journal on Image and Video Processing",
year = "2014",
month = "January",
doi = "10.1186/1687-5281-2014-2",
volume = "2014",
number = "2",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/eurasip-2014.pdf",
abstract = "User authentication is an important step to protect information, and in this context, face biometrics is potentially advantageous. Face biometrics is natural, intuitive, easy to use, and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using cheap low-tech equipment. This paper introduces a novel and appealing approach to detect face spoofing using the spatiotemporal (dynamic texture) extensions of the highly popular local binary pattern operator. The key idea of the approach is to learn and detect the structure and the dynamics of the facial micro-textures that characterise real faces but not fake ones. We evaluated the approach with two publicly available databases (Replay-Attack Database and CASIA Face Anti-Spoofing Database). The results show that our approach performs better than state-of-the-art techniques following the provided evaluation protocols of each database."
}
Stan Z. Li, Javier Galbally, André Anjos, and Sébastien Marcel. Evaluation databases. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter Appendix A, pages 247–278. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8.
@incollection{hopad-2014-3,
author = "Li, Stan Z. and Galbally, Javier and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Li, Stan Z.",
title = "Evaluation Databases",
booktitle = "Handbook of Biometric Anti-Spoofing",
chapter = "Appendix A",
year = "2014",
pages = "247--278",
publisher = "Springer-Verlag",
isbn = "978-1-4471-6523-1",
doi = "10.1007/978-1-4471-6524-8"
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Evaluation methodologies. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter 10, pages 185–204. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8_10.
@incollection{hopad-2014-2,
author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Li, Stan Z.",
title = "Evaluation Methodologies",
chapter = "10",
pages = "185--204",
booktitle = "Handbook of Biometric Anti-Spoofing",
publisher = "Springer-Verlag",
year = "2014",
doi = "10.1007/978-1-4471-6524-8\_10",
addendum = "(Issued from Ph.D co-supervision)",
abstract = "Following the definition of the task of the anti-spoofing systems to discriminate between real accesses and spoofing attacks, anti-spoofing can be regarded as a binary classification problem. The spoofing databases and the evaluation methodologies for anti-spoofing systems most often comply with the standards for binary classification problems. However, the anti-spoofing systems are not destined to work stand-alone, and their main purpose is to protect a verification system from spoofing attacks. In the process of combining the decision of an anti-spoofing and a recognition system, effects on the recognition performance can be expected. Therefore, it is important to analyze the problem of anti-spoofing under the umbrella of biometric recognition systems. This brings certain requirements in the database design, as well as adapted concepts for the evaluation of biometric recognition systems under spoofing attacks."
}
André Anjos, Jukka Komulainen, Sébastien Marcel, Abdenour Hadid, and Matti Pietikainen. Face anti-spoofing: visual approach. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter 4, pages 65–82. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8_4.
@incollection{hopad-2014,
author = "Anjos, André and Komulainen, Jukka and Marcel, Sébastien and Hadid, Abdenour and Pietikainen, Matti",
editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Li, Stan Z.",
title = "Face Anti-Spoofing: Visual Approach",
chapter = "4",
booktitle = "Handbook of Biometric Anti-Spoofing",
publisher = "Springer-Verlag",
pages = "65--82",
year = "2014",
doi = "10.1007/978-1-4471-6524-8\_4",
abstract = "User authentication is an important step to protect information and in this regard face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work revealed that face biometrics is quite vulnerable to spoofing attacks. This chapter presents the different modalities of attacks to visual spectrum face recognition systems. We introduce public datasets for the evaluation of vulnerability of recognition systems and performance of counter-measures. Finally, we build a comprehensive view of anti-spoofing techniques for visual spectrum face recognition and provide an outlook of issues that remain unaddressed."
}
André Anjos, Ivana Chingovska, and Sébastien Marcel. Anti-spoofing: face databases. In Stan Z. Li and Anil Jain, editors, Encyclopedia of Biometrics. Springer US, 2nd edition, 2014. doi:10.1007/978-3-642-27733-7_9212-2.
@incollection{eob-2014-2,
author = "Anjos, Andr{\'{e}} and Chingovska, Ivana and Marcel, S{\'{e}}bastien",
editor = "Li, Stan Z. and Jain, Anil",
title = "Anti-Spoofing: Face Databases",
booktitle = "Encyclopedia of Biometrics",
edition = "2nd",
year = "2014",
publisher = "Springer US",
isbn = "978-3-642-27733-7",
doi = "10.1007/978-3-642-27733-7\_9212-2",
addendum = "(Issued from Ph.D co-supervision)",
abstract = "Datasets for the evaluation of face verification system vulnerabilities to spoofing attacks and for the evaluation of face spoofing countermeasures."
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Anti-spoofing: evaluation methodologies. In Stan Z. Li and Anil Jain, editors, Encyclopedia of Biometrics. Springer US, 2nd edition, 2014. doi:10.1007/978-3-642-27733-7.
@incollection{eob-2014,
author = "Chingovska, Ivana and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
editor = "Li, Stan Z. and Jain, Anil",
title = "Anti-spoofing: Evaluation Methodologies",
booktitle = "Encyclopedia of Biometrics",
edition = "2nd",
year = "2014",
publisher = "Springer US",
isbn = "978-3-642-27733-7",
doi = "10.1007/978-3-642-27733-7",
addendum = "(Issued from Ph.D co-supervision)",
abstract = "Following the definition of the task of the anti-spoofing systems to discriminate between real accesses and spoofing attacks, anti-spoofing can be regarded as a binary classification problem. The spoofing databases and the evaluation methodologies for anti-spoofing systems most often comply with the standards for binary classification problems. However, the anti-spoofing systems are not destined to work stand-alone, and their main purpose is to protect a verification system from spoofing attacks. In the process of combining the decision of an anti-spoofing and a recognition system, effects on the recognition performance can be expected. Therefore, it is important to analyze the problem of anti-spoofing under the umbrella of biometric recognition systems. This brings certain requirements in the database design, as well as adapted concepts for the evaluation of biometric recognition systems under spoofing attacks."
}
2013
André Anjos, Murali Mohan Chakka, and Sébastien Marcel. Motion-based counter-measures to photo attacks in face recognition. IET Biometrics, July 2013. doi:10.1049/iet-bmt.2012.0071.
Article
@article{iet-biometrics-2013,
author = "Anjos, André and Chakka, Murali Mohan and Marcel, Sébastien",
title = "Motion-Based Counter-Measures to Photo Attacks in Face Recognition",
journal = "IET Biometrics",
year = "2013",
month = "July",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/iet-biometrics-2013.pdf",
doi = "10.1049/iet-bmt.2012.0071",
abstract = "Identity spoofing is a contender for high-security face recognition applications. With the advent of social media and globalized search, our face images and videos are widespread on the internet and can potentially be used to attack biometric systems without previous user consent. Yet, research to counter these threats is just in its infancy - we lack public standard databases, protocols to measure spoofing vulnerability and baseline methods to detect these attacks. The contributions of this work to the area are three-fold: firstly, we introduce a publicly available PHOTO-ATTACK database with associated protocols to measure the effectiveness of counter-measures. Based on the data available, we conduct a study on current state-of-the-art spoofing detection algorithms based on motion analysis, showing that they fail in the light of this new dataset. Lastly, we propose a new counter-measure technique based solely on foreground/background motion correlation using Optical Flow that outperforms all other algorithms, achieving nearly perfect scoring with an equal-error rate of 1.52\\% on the available test data. The source code leading to the reported results is made available for the replicability of the findings in this article."
}
I. Chingovska, J. Yang, Z. Lei, D. Yi, S. Z. Li, O. Kähm, C. Glaser, N. Damer, A. Kuijper, A. Nouak, J. Komulainen, T. Pereira, S. Gupta, S. Khandelwal, S. Bansal, A. Rai, T. Krishna, D. Goyal, M.-A. Waris, H. Zhang, I. Ahmad, S. Kiranyaz, M. Gabbouj, R. Tronci, M. Pili, N. Sirena, F. Roli, J. Galbally, J. Fierrez, A. Pinto, H. Pedrini, W. S. Schwartz, A. Rocha, A. Anjos, and S. Marcel. The 2nd competition on counter measures to 2D face spoofing attacks. In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6613026.
Article
@inproceedings{icb-2013-3,
author = "Chingovska, I. and Yang, J. and Lei, Z. and Yi, D. and Li, S. Z. and Kähm, O. and Glaser, C. and Damer, N. and Kuijper, A. and Nouak, A. and Komulainen, J. and Pereira, T. and Gupta, S. and Khandelwal, S. and Bansal, S. and Rai, A. and Krishna, T. and Goyal, D. and Waris, M.-A. and Zhang, H. and Ahmad, I. and Kiranyaz, S. and Gabbouj, M. and Tronci, R. and Pili, M. and Sirena, N. and Roli, F. and Galbally, J. and Fierrez, J. and Pinto, A. and Pedrini, H. and Schwartz, W. S. and Rocha, A. and Anjos, A. and Marcel, S.",
title = "The 2nd Competition on Counter Measures to 2D Face Spoofing Attacks",
booktitle = "International Conference on Biometrics 2013",
month = "June",
year = "2013",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icb-2013-3.pdf",
doi = "10.1109/ICB.2013.6613026",
abstract = "As a crucial security problem, anti-spoofing in biometrics, and particularly for the face modality, has achieved great progress in recent years. Still, new threats arrive in the form of better, more realistic and more sophisticated spoofing attacks. The objective of the 2nd Competition on Counter Measures to 2D Face Spoofing Attacks is to challenge researchers to create counter measures that effectively detect a variety of attacks. The submitted propositions are evaluated on the Replay-Attack database and the achieved results are presented in this paper."
}
Jukka Komulainen, Abdenour Hadid, Matti Pietikäinen, André Anjos, and Sébastien Marcel. Complementary countermeasures for detecting scenic face spoofing attacks. In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6612968.
Article
@inproceedings{icb-2013-2,
author = "Komulainen, Jukka and Hadid, Abdenour and Pietikäinen, Matti and Anjos, André and Marcel, Sébastien",
title = "Complementary Countermeasures for Detecting Scenic Face Spoofing Attacks",
booktitle = "International Conference on Biometrics 2013",
month = "June",
year = "2013",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icb-2013-2.pdf",
doi = "10.1109/ICB.2013.6612968",
abstract = "The face recognition community has finally started paying more attention to the long-neglected problem of spoofing attacks. The number of countermeasures is gradually increasing and fairly good results have been reported on the publicly available databases. There exists no superior anti-spoofing technique due to the varying nature of attack scenarios and acquisition conditions. Therefore, it is important to find complementary countermeasures and study how they should be combined in order to construct an easily extensible anti-spoofing framework. In this paper, we address this issue by studying the fusion of motion- and texture-based countermeasures under several types of scenic face attacks. We provide an intuitive way to explore the fusion potential of different visual cues and show that the performance of the individual methods can be vastly improved by performing fusion at score level. The Half-Total Error Rate (HTER) of the best individual countermeasure was decreased from 11.2\\% to 5.1\\% on the Replay Attack Database. More importantly, we question the idea of using complex classification schemes in individual countermeasures, since nearly the same fusion performance is obtained by replacing them with a simple linear one. In this manner, the computational efficiency and probably also the generalization ability of the resulting anti-spoofing framework are increased."
}
Tiago de Freitas Pereira, André Anjos, José Mario De Martino, and Sébastien Marcel. Can face anti-spoofing countermeasures work in a real world scenario? In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6612981.
Article
@inproceedings{icb-2013-1,
author = "de Freitas Pereira, Tiago and Anjos, André and De Martino, José Mario and Marcel, Sébastien",
title = "Can face anti-spoofing countermeasures work in a real world scenario?",
booktitle = "International Conference on Biometrics 2013",
month = "June",
year = "2013",
doi = "10.1109/ICB.2013.6612981",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icb-2013-1.pdf",
abstract = "User authentication is an important step to protect information and in this field face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using low-tech equipment. This article assesses how well existing face anti-spoofing countermeasures can work in a more realistic condition. Experiments carried out with two freely available video databases (Replay Attack Database and CASIA Face Anti-Spoofing Database) show low generalization and possible database bias in the evaluated countermeasures. To generalize and deal with the diversity of attacks in a real world scenario, we introduce two strategies that show promising results."
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Anti-spoofing in action: joint operation with a verification system. In Computer Vision and Pattern Recognition Conference - Biometrics Workshop. June 2013. doi:10.1109/CVPRW.2013.22.
Article
@inproceedings{cvpr-bw-2013,
author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
title = "Anti-spoofing in action: joint operation with a verification system",
booktitle = "Computer Vision and Pattern Recognition Conference - Biometrics Workshop",
year = "2013",
doi = "10.1109/CVPRW.2013.22",
month = "June",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/cvpr-bw-2013.pdf",
abstract = "Besides the recognition task, today's biometric systems need to cope with an additional problem: spoofing attacks. To date, academic research considers spoofing as a binary classification problem: systems are trained to discriminate between real accesses and attacks. However, spoofing counter-measures are not designed to operate stand-alone, but as part of the recognition system they will protect. In this paper, we study techniques for decision-level and score-level fusion to integrate recognition and anti-spoofing systems, using an open-source framework that handles the ternary classification problem (clients, impostors and attacks) transparently. By doing so, we are able to report the impact of different spoofing counter-measures, fusion techniques and thresholding on the overall performance of the final recognition system. For a specific use case covering face verification, experiments show to what extent simple fusion improves the trustworthiness of the system when exposed to spoofing attacks."
}
2012
Ivana Chingovska, André Anjos, and Sébastien Marcel. On the effectiveness of local binary patterns in face anti-spoofing. In IEEE International Conference of the Biometrics Special Interest Group. 2012.
Article
@inproceedings{biosig-2012,
author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
title = "On the Effectiveness of Local Binary Patterns in Face Anti-spoofing",
booktitle = "IEEE International Conference of the Biometrics Special Interest Group",
year = "2012",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/biosig-2012.pdf",
isbn = "978-3-88579-290-1",
abstract = "Spoofing attacks are one of the security threats that biometric recognition systems are proven to be vulnerable to. When spoofed, a biometric recognition system is bypassed by presenting a copy of the biometric evidence of a valid user. Among all biometric modalities, spoofing a face recognition system is particularly easy to perform: all that is needed is a simple photograph of the user. In this paper, we address the problem of detecting face spoofing attacks. In particular, we inspect the potential of texture features based on Local Binary Patterns (LBP) and their variations on three types of attacks: printed photographs, and photos and videos displayed on electronic screens of different sizes. For this purpose, we introduce REPLAY-ATTACK, a novel publicly available face spoofing database which contains all the mentioned types of attacks. We conclude that LBP, with \textasciitilde 15\\% Half Total Error Rate, shows moderate discriminability when confronted with a wide set of attack types."
}
André Anjos, Laurent El Shafey, Roy Wallace, Manuel Günther, Chris McCool, and Sébastien Marcel. Bob: a free signal processing and machine learning toolbox for researchers. In ACM Multimedia 2012, 1449–1452. 2012. doi:10.1145/2393347.2396517.
Article
@inproceedings{acmmm-2012,
author = "Anjos, André and Shafey, Laurent El and Wallace, Roy and Günther, Manuel and McCool, Chris and Marcel, Sébastien",
title = "Bob: a free signal processing and machine learning toolbox for researchers",
booktitle = "ACM Multimedia 2012",
year = "2012",
pages = "1449--1452",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/acmmm-2012.pdf",
doi = "10.1145/2393347.2396517",
abstract = "Bob is a free signal processing and machine learning toolbox originally developed by the Biometrics group at Idiap Research Institute, Switzerland. The toolbox is designed to meet the needs of researchers by reducing development time and efficiently processing data. Firstly, Bob provides a researcher-friendly Python environment for rapid development. Secondly, efficient processing of large amounts of multimedia data is provided by fast C++ implementations of identified bottlenecks. The Python environment is integrated seamlessly with the C++ library, which ensures the library is easy to use and extensible. Thirdly, Bob supports reproducible research through its integrated experimental protocols for several databases. Finally, a strong emphasis is placed on code clarity, documentation, and thorough unit testing. Bob is thus an attractive resource for researchers due to this unique combination of ease of use, efficiency, extensibility and transparency. Bob is an open-source library and an ongoing community effort."
}
Tiago de Freitas Pereira, André Anjos, José Mario De Martino, and Sébastien Marcel. LBP-TOP based countermeasure against facial spoofing attacks. In International Workshop on Computer Vision With Local Binary Pattern Variants. 2012. doi:10.1007/978-3-642-37410-4_11.
Article
@inproceedings{accv-2012,
author = "de Freitas Pereira, Tiago and Anjos, André and De Martino, José Mario and Marcel, Sébastien",
title = "LBP-TOP based countermeasure against facial spoofing attacks",
booktitle = "International Workshop on Computer Vision With Local Binary Pattern Variants",
year = "2012",
doi = "10.1007/978-3-642-37410-4\_11",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/accv-2012.pdf",
abstract = "User authentication is an important step to protect information and in this field face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using cheap low-tech equipment. This article presents a countermeasure against such attacks based on the LBP-TOP operator, combining both space and time information into a single multiresolution texture descriptor. Experiments carried out with the REPLAY ATTACK database show a Half Total Error Rate (HTER) improvement from 15.16\\% to 7.60\\%."
}
2011
Murali Mohan Chakka, André Anjos, Sébastien Marcel, and others. Competition on counter measures to 2-D facial spoofing attacks. In International Joint Conference on Biometrics 2011. October 2011. doi:10.1109/IJCB.2011.6117509.
Article
@inproceedings{ijcb-2011,
author = "Chakka, Murali Mohan and Anjos, André and Marcel, Sébastien and others",
title = "Competition on Counter Measures to 2-D Facial Spoofing Attacks",
booktitle = "International Joint Conference on Biometrics 2011",
year = "2011",
month = "October",
doi = "10.1109/IJCB.2011.6117509",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ijcb-2011.pdf",
abstract = "Spoofing identities using photographs is one of the most common techniques to attack 2-D face recognition systems. There seem to exist no comparative studies of different techniques using the same protocols and data. The motivation behind this competition is to compare the performance of different state-of-the-art algorithms on the same database using a unique evaluation method. Six different teams from universities around the world have participated in the contest. The use of one or multiple techniques from motion, texture analysis and liveness detection appears to be the common trend in this competition. Most of the algorithms are able to clearly separate spoof attempts from real accesses. The results suggest the investigation of more complex attacks."
}
André Anjos and Sébastien Marcel. Counter-measures to photo attacks in face recognition: a public database and a baseline. In International Joint Conference on Biometrics 2011. October 2011. doi:10.1109/IJCB.2011.6117503.
Article
@inproceedings{ijcb-2011-2,
author = "Anjos, André and Marcel, Sébastien",
title = "Counter-Measures to Photo Attacks in Face Recognition: a public database and a baseline",
booktitle = "International Joint Conference on Biometrics 2011",
year = "2011",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ijcb-2011-2.pdf",
doi = "10.1109/IJCB.2011.6117503",
abstract = "A common technique to by-pass 2-D face recognition systems is to use photographs of spoofed identities. Unfortunately, research in counter-measures to this type of attack has not kept up - even if such threats have been known for nearly a decade, there seems to exist no consensus on best practices, techniques or protocols for developing and testing spoofing detectors for face recognition. We attribute the reason for this delay, partly, to the unavailability of public databases and protocols to study solutions and compare results. To this purpose, we introduce the publicly available PRINT-ATTACK database and exemplify how to use its companion protocol with a motion-based algorithm that detects correlations between the person's head movements and the scene context. The results are to be used as a basis for comparison to other counter-measure techniques. The PRINT-ATTACK database contains 200 videos of real accesses and 200 videos of spoof attempts using printed photographs of 50 different identities."
}
2010
The ATLAS Collaboration. ATLAS trigger and data acquisition: capabilities and commissioning. In 11th Pisa Meeting on Advanced Detectors, volume 617, 306–309. 2010. doi:10.1016/j.nima.2009.06.114.
Article
@inproceedings{nima-2010,
author = "{The ATLAS Collaboration}",
title = "ATLAS Trigger and Data Acquisition: capabilities and commissioning",
year = "2010",
volume = "617",
number = "1",
pages = "306--309",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nima-2010.pdf",
booktitle = "11th Pisa Meeting on Advanced Detectors",
issn = "0168-9002",
doi = "10.1016/j.nima.2009.06.114",
abstract = "The ATLAS trigger system is based on three levels of event selection that select the physics of interest from an initial bunch crossing rate of 40\textasciitilde MHz down to an output rate of \textasciitilde 200\textasciitilde Hz, compatible with the offline computing power and storage capacity. During nominal LHC operations at a luminosity of 10^34\textasciitilde cm^-2 s^-1, decisions must be taken every 25\textasciitilde ns. The LHC is expected to begin operations with a peak luminosity of 10^31\textasciitilde cm^-2 s^-1 with far fewer bunches, but quickly ramp up to higher luminosities. Hence, the ATLAS Trigger and Data Acquisition system needs to adapt to the changing beam conditions, preserving the interesting physics and detector requirements that may vary with these conditions."
}
2009
R.C. Torres, A. Anjos, and J.M. Seixas. Automatizing the online filter test management for a general-purpose particle detector. Computer Physics Communications, October 2009. doi:10.1016/j.cpc.2010.10.003.
Article
@article{cpc-2009,
author = "Torres, R.C. and Anjos, A. and Seixas, J.M.",
title = "Automatizing the Online Filter Test Management for a General-Purpose Particle Detector",
journal = "Computer Physics Communications",
year = "2009",
month = "October",
doi = "10.1016/j.cpc.2010.10.003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/cpc-2009.pdf",
abstract = "This paper presents a software environment to automatically configure and run online triggering and dataflow farms for the ATLAS experiment at the Large Hadron Collider (LHC). It provides support for a broad set of users, with distinct knowledge about the online triggering system, ranging from casual testers to final system deployers. This level of automatization improves the overall ATLAS TDAQ work flow for software and hardware tests and speeds-up system modifications and deployment."
}
The ATLAS Collaboration. ATLAS trigger status and results from commissioning operations. In Advanced Computing on High-Energy Physics 2008, Erice, Sicily, Italy. 2009.
Article
@inproceedings{acat-2009,
author = "Collaboration, The ATLAS",
title = "ATLAS Trigger Status and Results From Commissioning Operations",
booktitle = "Advanced Computing on High-Energy Physics 2008, Erice, Sicily, Italy",
year = "2009",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/acat-2009.pdf",
abstract = "The ATLAS trigger system is designed to select rare physics processes of interest from an extremely high rate of proton-proton collisions, reducing the LHC incoming rate by a factor of about 10^7. The short LHC bunch crossing period of 25 ns and the large background of soft-scattering events overlapped in each bunch crossing pose serious challenges, both on hardware and software, that the ATLAS trigger must overcome in order to efficiently select interesting events. The ATLAS trigger consists of a hardware-based Level-1 and a two-level software-based High-Level Trigger (HLT). Data bandwidth and processing times in the higher level triggers are reduced by region-of-interest guidance in the HLT reconstruction steps. High flexibility is critical in order to adapt to the changing luminosity, backgrounds and physics goals. It is achieved by the use of inclusive trigger menus and modular software design. Selection algorithms have been developed which provide the required elasticity to detect different physics signatures and to control the trigger rates. In this paper an overview of the ATLAS trigger design, status and expected performance, as well as the results from the on-going commissioning with cosmic rays and first LHC beams, is presented."
}
The ATLAS Collaboration. ATLAS trigger for first physics and beyond. In Physics at LHC 2008, 29 September - 4 October 2008, Split, Croatia. 2009.
Article
@inproceedings{lhc-2009,
author = "Collaboration, The ATLAS",
title = "ATLAS Trigger for First Physics and Beyond",
booktitle = "Physics at LHC 2008, 29 September - 4 October 2008, Split, Croatia",
year = "2009",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/lhc-2009.pdf",
abstract = "ATLAS is a multi-purpose spectrometer built to perform precision measurements of Standard Model parameters and aims at the discovery of the Higgs particle, Supersymmetry and possible other physics channels beyond the Standard Model. Operating at 14 TeV center-of-mass energy, ATLAS will see 40 million events per second at nominal luminosity, with about 25 overlapping interactions. Most of the events are inelastic proton-proton interactions, with only a few W, Z bosons or ttbar pairs produced each second, and expectations for Higgs or SUSY production cross-sections are much smaller than that. The ATLAS trigger has the difficult task of selecting one out of 10^5 events online while ensuring that most physics channels of interest are preserved for analysis. In this talk we review the design of the ATLAS trigger system and the trigger menu prepared for the initial LHC run as well as for the high-luminosity run. The expected trigger performance for the base-line ATLAS physics programs is reviewed and first results from the commissioning period are given. Methods to measure trigger efficiencies and biases directly from data are discussed."
}
The ATLAS Collaboration. Configuration and control of the ATLAS trigger and data acquisition. In The 1st international conference on Technology and Instrumentation in Particle Physics. 2009.
Article
@inproceedings{tipp-2009,
author = "Collaboration, The ATLAS",
title = "Configuration and Control of the ATLAS Trigger and Data Acquisition",
booktitle = "The 1st international conference on Technology and Instrumentation in Particle Physics",
year = "2009",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/tipp-2009.pdf",
abstract = "ATLAS is the biggest of the experiments aimed at studying high-energy particle interactions at the Large Hadron Collider (LHC). This paper describes the evolution of the Controls and Configuration system of the ATLAS Trigger and Data Acquisition (TDAQ) from the Technical Design Report (TDR) in 2003 to the first events taken at CERN with circulating beams in autumn 2008. The present functionality and performance and the lessons learned during the development are outlined. At the end we will also highlight some of the challenges which still have to be met by 2010, when the full scale of the trigger farm will be deployed."
}
The ATLAS Collaboration. The ATLAS online high level trigger framework: experience reusing offline software components in the ATLAS trigger. In Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009. 2009.
Article
@inproceedings{chep-2009,
author = "Collaboration, The ATLAS",
title = "The ATLAS online High Level Trigger framework: experience reusing offline software components in the ATLAS trigger",
booktitle = "Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009",
year = "2009",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2009.pdf",
abstract = "Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the Gaudi and ATLAS Athena frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from the LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software of the use of multi-core processors in the computing farms, and on the experience gained with multi-threading and multi-process technologies."
}
The ATLAS Collaboration. Commissioning of the ATLAS high level trigger with single beam and cosmic rays. In Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009. 2009.
Article
@inproceedings{chep-2009-2,
author = "Collaboration, The ATLAS",
title = "Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays",
booktitle = "Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009",
year = "2009",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2009-2.pdf",
abstract = "ATLAS is one of the two general-purpose detectors at the Large Hadron Collider (LHC). The trigger system is responsible for making the online selection of interesting collision events. At the LHC design luminosity of 10^34 cm^-2s^-1 it will need to achieve a rejection factor of the order of 10^7 against random proton-proton interactions, while selecting with high efficiency events that are needed for physics analyses. After a first processing level using custom electronics based on FPGAs and ASICs, the trigger selection is made by software running on two processor farms, containing a total of around two thousand multi-core machines. This system is known as the High Level Trigger (HLT). To reduce the network data traffic and the processing time to manageable levels, the HLT uses seeded, step-wise reconstruction, aiming at the earliest possible rejection of background events. The recent LHC startup and short single-beam run provided a 'stress test' of the system and some initial calibration data. Following this period, ATLAS continued to collect cosmic-ray events for detector alignment and calibration purposes. After giving an overview of the trigger design and its innovative features, this paper focuses on the experience gained from operating the ATLAS trigger with single LHC beams and cosmic-rays."
}
2008
The ATLAS Collaboration. The ATLAS experiment at the CERN Large Hadron Collider. Journal of Instrumentation, August 2008.
Article
@article{jinst-2008,
author = "Collaboration, The ATLAS",
title = "The ATLAS Experiment at the CERN Large Hadron Collider",
journal = "Journal of Instrumentation",
year = "2008",
month = "August",
number = "S08003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/jinst-2008.pdf",
abstract = "The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented."
}
The ATLAS Collaboration. Expected performance of the ATLAS experiment detector, trigger, physics. Technical Report 2008–020, CERN Open Documentation, 2008.
Article
@techreport{cern-2008,
author = "Collaboration, The ATLAS",
title = "Expected Performance of the ATLAS Experiment Detector, Trigger, Physics",
institution = "CERN Open Documentation",
year = "2008",
number = "2008--020",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/cern-2008.pdf",
abstract = "A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN."
}
The ATLAS Collaboration. Readiness of the ATLAS trigger and data acquisition system for the first LHC beams. In 11th Topical Seminar On Innovative Particle And Radiation Detectors, Siena, Italy. 2008.
Article
@inproceedings{iprd-2008,
author = "Collaboration, The ATLAS",
title = "Readiness of the ATLAS Trigger and Data Acquisition system for the first LHC beams",
booktitle = "11th Topical Seminar On Innovative Particle And Radiation Detectors, Siena, Italy",
year = "2008",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/iprd-2008.pdf",
abstract = "The ATLAS Trigger and Data Acquisition (TDAQ) system is based on O(2k) processing nodes, interconnected by a multi-layer Gigabit network, and consists of a combination of custom electronics and commercial products. In its final configuration, O(20k) applications will provide the needed capabilities in terms of event selection, data flow, local storage and data monitoring. In preparation for the first LHC beams, many TDAQ sub-systems have already reached their final configuration and roughly one third of the final processing power has been deployed. Therefore, the current system allows for a sensible evaluation of the performance and scaling properties. In this paper we introduce the ATLAS TDAQ system requirements and architecture, and discuss the status of its software and hardware components. Moreover, we present the results of performance measurements that validate the system design and provide a figure for the ATLAS data-acquisition capabilities in the initial data-taking period."
}
André Anjos on behalf of the ATLAS Collaboration. The DAQ/HLT system of the ATLAS experiment. In International Workshop on Advanced Computing and Analysis Techniques in Physics Research. 2008.
Article
@inproceedings{acat-2008,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "The DAQ/HLT system of the ATLAS experiment",
booktitle = "International Workshop on Advanced Computing and Analysis Techniques in Physics Research",
year = "2008",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/acat-2008.pdf",
abstract = "The DAQ/HLT system of the ATLAS experiment at CERN, Switzerland, is being commissioned for first collisions in 2009. Presently, the system is composed of an already very large farm of computers that accounts for about one-third of its event processing capacity. Event selection is conducted in two steps after the hardware-based Level-1 Trigger: a Level-2 Trigger processes detector data based on regions of interest (RoI) and an Event Filter operates on the full event data assembled by the Event Building system. The detector readout is fully commissioned and can be operated at its full design capacity. This places on the High-Level Trigger system the responsibility to maximize the quality of data that will finally reach the offline reconstruction farms. This paper gives an overview of the current ATLAS DAQ/HLT implementation and performance, based on studies originating from its operation with simulated data, cosmic particles and first-beam data. Its built-in event processing parallelism is discussed for both HLT levels, together with an outlook on options to improve it."
}
André Anjos. Trigger systems. In Experimental High-Energy Physics and Associated Technologies Workshop. 2008.
Article
@inproceedings{talk-2008,
author = "Anjos, André",
title = "Trigger Systems",
booktitle = "Experimental High-Energy Physics and Associated Technologies Workshop",
year = "2008",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/talk-2008.pptx",
abstract = "This is an invited talk. The contents cover the fundamentals of triggering systems in High-Energy Physics experiments."
}
2007
Thiago Ciodaro Xavier, André Rabello Anjos, and José Manoel de Seixas. Discriminação neural de partículas para um detector submetido a uma alta taxa de eventos. Learning and Nonlinear Models - Revista da Sociedade Brasileira de Redes Neurais (SBRN), 4(2):79–92, October 2007.
Article
@article{sbrn-2007,
author = "Xavier, Thiago Ciodaro and Anjos, André Rabello and de Seixas, José Manoel",
title = "Discriminação Neural de Partículas para um Detector Submetido a uma Alta Taxa de Eventos",
journal = "Learning and Nonlinear Models - Revista da Sociedade Brasileira de Redes Neurais (SBRN)",
year = "2007",
month = "October",
volume = "4",
number = "2",
pages = "79--92",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/sbrn-2007.pdf",
abstract = "This article (written in Portuguese) presents the results of using neural networks for the optimization of the online filtering system of ATLAS, one of the main detectors at the LHC (Large Hadron Collider) particle collider. The regions of interest of the ATLAS calorimeter, the detector's energy measurement system, are mapped into 100 rings of energy deposition, which feed a neural network that classifies them as electron or jet. For the signal pre-processing, relevance mapping and PCA (Principal Component Analysis) are used to compact the information, increasing the processing speed and, eventually, the detection efficiency, while decreasing the false-alarm rate."
}
The ATLAS Collaboration. Performance of the final event builder for the ATLAS experiment. In 15th IEEE Real Time Conference 2007. 2007.
Article
@inproceedings{rt-2007,
author = "Collaboration, The ATLAS",
title = "Performance of the final Event Builder for the ATLAS Experiment",
booktitle = "15th IEEE Real Time Conference 2007",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/rt-2007.pdf",
abstract = "Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three level trigger system, which reduces the initial bunch crossing rate of 40 MHz at its first two trigger levels (LVL1+LVL2) to \textasciitilde 3 kHz. At this rate the Event-Builder collects the data from all Read-Out system PCs (ROSs) and provides fully assembled events to the Event-Filter (EF), which is the third level trigger, to achieve a further rate reduction to \textasciitilde 200 Hz for permanent storage. The Event-Builder is based on a farm of O(100) PCs, interconnected via Gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and one third of the Event-Builder PCs are already installed and commissioned. We report on performance tests on this initial system, which show promising results to reach the final data throughput required for the ATLAS experiment."
}
André Anjos on behalf of the ATLAS Collaboration. The configuration system of the ATLAS trigger. In IEEE Real-time conference. 2007.
Article
@inproceedings{rt-2007-2,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "The Configuration System of the ATLAS Trigger",
booktitle = "IEEE Real-time conference",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/rt-2007-2.pdf",
abstract = "The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a rate of 40 MHz. To reduce the data rate, only potentially interesting events are selected by a three-level trigger system. The first level is implemented in custom-made electronics, reducing the data output rate to less than 100 kHz. The second and third level are software triggers with a final output rate of 100 to 200 Hz. A system has been designed and implemented that holds and records the configuration information of all three trigger levels at a centrally maintained location. This system provides consistent configuration information to the online trigger for the purpose of data taking as well as to the offline trigger simulation. The use of relational database technology provides a means of reliable recording of the trigger configuration history over the lifetime of the experiment. Tools for flexible browsing of trigger configurations, and for their distribution across the ATLAS reconstruction sites have been developed. The usability of this design has been demonstrated in dedicated configuration tests of the ATLAS level-1 Central Trigger and of a 600-node software trigger computing farm. Further tests on a computing cluster which is part of the final high level trigger system were also successful."
}
The ATLAS Collaboration. Integration of the trigger and data acquisition systems in ATLAS. In IEEE Real-time conference. 2007.
Article
@inproceedings{rt-2007-3,
author = "Collaboration, The ATLAS",
title = "Integration of the Trigger and Data Acquisition systems in ATLAS",
booktitle = "IEEE Real-time conference",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/rt-2007-3.pdf",
abstract = "During 2006 and spring 2007, integration and commissioning of trigger and data acquisition (TDAQ) equipment in the ATLAS experimental area has progressed. Much of the work has focused on a final prototype setup consisting of around eighty computers representing a subset of the full TDAQ system. There have been a series of technical runs using this setup. Various tests have been run including ones where around 6k Level-1 pre-selected simulated proton-proton events have been processed in a loop mode through the trigger and dataflow chains. The system included the readout buffers containing the events, event building, second level and third level trigger algorithms. Quantities critical for the final system, such as event processing times, have been studied using different trigger algorithms as well as different dataflow components."
}
Rodrigo Coura Torres, José Manoel Seixas, André Rabello dos Anjos, and Danilo Vannier Cunha. Online electron/jet neural high-level trigger over independent calorimetry information. In XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research. 2007.
Article
@inproceedings{acat-2007,
author = "Torres, Rodrigo Coura and Seixas, José Manoel and dos Anjos, André Rabello and Cunha, Danilo Vannier",
title = "Online Electron/Jet Neural High-Level Trigger over Independent Calorimetry Information",
booktitle = "XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/acat-2007.pdf",
abstract = "A data volume of 60 TB/s is expected from the high LHC collision rate and high resolution of the ATLAS detectors. To cope with this bandwidth, a highly programmable, three-level online triggering system is under development. One of the main components of this system is an electron/jet discriminator that uses the highly segmented calorimetry information. In this work, we address the electron/jet discrimination at the second-level trigger by building a set of concentric ring sums around the energy deposition peak in each calorimeter segment. An Independent Component Analysis (ICA) on these ring sums is then performed to extract the main sources of the calorimeter signal. The extracted independent components feed the input nodes of a neural electron/jet discriminator. The proposed system is able to achieve higher detection efficiency than the current electron/jet discriminating system operating in ATLAS, while being fast enough to cope with the time restrictions of the ATLAS triggering system operation."
}
The ATLAS Collaboration. The ATLAS trigger - commissioning with cosmic rays. In International Conference on Computing in High Energy and Nuclear Physics. 2007.
Article
@inproceedings{chep-2007-2,
author = "Collaboration, The ATLAS",
title = "The ATLAS Trigger - Commissioning with cosmic rays",
booktitle = "International Conference on Computing in High Energy and Nuclear Physics",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2007-2.pdf",
abstract = "The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions from beams crossing at 40 MHz. At the design luminosity there are roughly 23 collisions per bunch crossing. ATLAS has designed a three-level trigger system to select potentially interesting events. The first-level trigger, implemented in custom-built electronics, reduces the incoming rate to less than 100 kHz with a total latency of less than 2.5 μs. The next two trigger levels run in software on commercial PC farms. They reduce the output rate to 100-200 Hz. In preparation for collision data-taking, which is scheduled to commence in May 2008, several cosmic-ray commissioning runs have been performed. Among the first sub-detectors available for commissioning runs are parts of the barrel muon detector, including the RPC detectors that are used in the first-level trigger. Data have been taken with a full slice of the muon trigger and readout chain, from the detectors in one sector of the RPC system, to the second-level trigger algorithms and the data-acquisition system. The system is being prepared to include the inner-tracking detector in the readout and second-level trigger. We present the status and results of these cosmic-ray based commissioning activities. This work will prove invaluable not only during the commissioning phase but also for cosmic-ray data-taking during normal running for detector performance studies."
}
The ATLAS Collaboration. Alignment data streams for the ATLAS inner detector. In Computing for High-Energy Physics. 2007.
Article
@inproceedings{chep-2007,
author = "Collaboration, The ATLAS",
title = "Alignment data streams for the ATLAS Inner Detector",
booktitle = "Computing for High-Energy Physics",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2007.pdf",
abstract = "The ATLAS experiment uses a complex trigger strategy to be able to reduce the Event Filter output rate down to a level that allows the storage and processing of these data. These concepts are described in the ATLAS Computing Model, which embraces the Grid paradigm. The output coming from the Event Filter consists of four main streams: physics stream, express stream, calibration stream, and diagnostic stream. The calibration stream will be transferred to the Tier-0 facilities that will provide the prompt reconstruction of this stream with a minimum latency of 8 hours, producing calibration constants of sufficient quality to allow a first-pass processing. The Inner Detector community is developing and testing an independent common calibration stream selected at the Event Filter after track reconstruction. It is composed of raw data, in byte-stream format, contained in Readout Buffers (ROBs) with hit information of the selected tracks, and it will be used to derive and update a set of calibration and alignment constants. This option was selected because it makes use of the Byte Stream Converter infrastructure and possibly gives better bandwidth usage and storage optimization. Processing is done using specialized algorithms running in the Athena framework on dedicated Tier-0 resources, and the alignment constants will be stored and distributed using the COOL conditions database infrastructure. This work addresses in particular the alignment requirements, the needs for track and hit selection, and performance issues."
}
The ATLAS Collaboration. The ATLAS trigger - high-level trigger commissioning and operation during early data taking. In International Europhysics Conference on High Energy Physics. 2007.
Article
@inproceedings{eurochep-2007,
author = "Collaboration, The ATLAS",
title = "The ATLAS trigger - high-level trigger commissioning and operation during early data taking",
booktitle = "International Europhysics Conference on High Energy Physics",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/eurochep-2007.pdf",
abstract = "The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre of mass energy of 14\textasciitilde TeV, with a bunch-crossing rate of 40\textasciitilde MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200\textasciitilde Hz. This paper gives an overview of the ATLAS High Level Trigger focusing on the system design and its innovative features. We then present the ATLAS trigger strategy for the initial phase of LHC exploitation. Finally, we report on the valuable experience acquired through in-situ commissioning of the system where simulated events were used to exercise the trigger chain. In particular we show critical quantities such as event processing times, measured in a large-scale HLT farm using a complex trigger menu."
}
The ATLAS Collaboration. The ATLAS event builder. In IEEE Nuclear Science Symposium and Medical Imaging Conference. 2007.
Article
@inproceedings{nss-2007,
author = "Collaboration, The ATLAS",
title = "The ATLAS Event Builder",
booktitle = "IEEE Nuclear Science Symposium and Medical Imaging Conference",
year = "2007",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nss-2007.pdf",
abstract = "Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which, at its first two trigger levels (LVL1+LVL2), reduces the initial bunch crossing rate of 40 MHz to ∼3 kHz. At this rate, the Event Builder collects the data from the readout system PCs (ROSs) and provides fully assembled events to the Event Filter (EF). The EF is the third trigger level and its aim is to achieve a further rate reduction to ∼200 Hz on the permanent storage. The Event Builder is based on a farm of O(100) PCs."
}
2006
The ATLAS Collaboration. The ATLAS data acquisition and trigger: concept, design and status. Nucl. Phys. B, Proc. Suppl., 172:178–182, November 2006.
Article
@article{nimb-2006,
author = "Collaboration, The ATLAS",
title = "The ATLAS Data Acquisition and Trigger: concept, design and status",
journal = "Nucl. Phys. B, Proc. Suppl.",
year = "2006",
month = "November",
volume = "172",
pages = "178--182",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nimb-2006.pdf",
abstract = "This article presents the base-line design and implementation of the ATLAS Trigger and Data Acquisition system, in particular the Data Flow and High Level Trigger components. The status of the installation and commissioning of the system is also presented."
}
André Anjos on behalf of the ATLAS Collaboration. Deployment of the ATLAS high-level trigger. IEEE Transactions on Nuclear Science, 53:2144–2149, August 2006.
Article
@article{ieee-tns-2006,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "Deployment of the ATLAS High-Level Trigger",
journal = "IEEE Transactions on Nuclear Science",
year = "2006",
month = "August",
volume = "53",
pages = "2144--2149",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2006.pdf",
abstract = "The ATLAS combined test beam in the second half of 2004 saw the first deployment of the ATLAS High-Level Trigger (HLT). The next steps are deployment on the pre-series farms in the experimental area during 2005, commissioning and cosmics tests with the full detector in 2006 and collisions in 2007. This paper reviews the experience gained in the test beam, describes the current status and discusses the further enhancements to be made. We address issues related to the dataflow, integration of selection algorithms, testing, software distribution, installation and improvements."
}
A. Anjos, R.C. Torres, J.M. Seixas, B.C. Ferreira, and T.C. Xavier. Neural triggering system operating on high resolution calorimetry information. Nuclear Instruments and Methods in Physics Research, 559:134–138, April 2006.
Article
@article{nima-2006,
author = "Anjos, A. and Torres, R.C. and Seixas, J.M. and Ferreira, B.C. and Xavier, T.C.",
title = "Neural triggering system operating on high resolution calorimetry information",
journal = "Nuclear Instruments and Methods in Physics Research",
year = "2006",
month = "April",
volume = "559",
OPTnumber = "",
pages = "134--138",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nima-2006.pdf",
abstract = "This paper presents an electron/jet discriminator system for operation at the Second Level Trigger of ATLAS. The system processes calorimetry data and organizes the regions of interest in the calorimeter in the form of concentric ring sums of energy deposition, so that both signal compaction and high performance can be achieved. The ring information is fed into a feed-forward neural discriminator. This implementation resulted in a 97\\% electron detection efficiency for a false alarm rate of 3\\%. The full discrimination chain could still be executed in less than 500 microseconds."
}
André Anjos on behalf of the ATLAS Collaboration. A configuration system for the ATLAS trigger. Journal of Instrumentation, Institute of Physics Publishing and Sissa, February 2006.
Article
@article{jinst-2006,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "A configuration system for the ATLAS trigger",
journal = "Journal of Instrumentation, Institute of Physics Publishing and Sissa",
year = "2006",
month = "February",
OPTvolume = "",
number = "P05004",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/jinst-2006.pdf",
abstract = "The ATLAS detector at CERN's Large Hadron Collider will be exposed to proton–proton collisions from beams crossing at 40 MHz that have to be reduced to the few hundreds of Hz allowed by the storage systems. A three-level trigger system has been designed to achieve this goal. We describe the configuration system under construction for the ATLAS trigger chain. It provides the trigger system with all the parameters required for decision taking and for recording its history. The same system configures the event reconstruction, Monte Carlo simulation and data analysis, and provides tools for accessing and manipulating the configuration data in all contexts."
}
The ATLAS Collaboration. Testing on a large scale: running the ATLAS data acquisition and high level trigger software on 700 PC nodes. In Computing In High Energy and Nuclear Physics. 2006.
Article
@inproceedings{chep-2006-3,
author = "Collaboration, The ATLAS",
title = "Testing on a Large Scale: running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes",
booktitle = "Computing In High Energy and Nuclear Physics",
year = "2006",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2006-3.doc",
abstract = "The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This large number of PCs will only be purchased shortly before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005, over a period of 5 weeks, on a farm whose size was increased stepwise from 100 up to 700 dual PC nodes. The interplay between the control and monitoring software and the event readout, event building and trigger software was exercised for the first time as an integrated system on this large scale. Running algorithms in the online environment for the trigger selection and in the event filter processing tasks on a larger scale was also new. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently via peer-to-peer software to this large PC cluster. The findings obtained during the tests led to many immediate improvements in the software. Trend analysis allowed critical areas to be identified. Successfully running an online system on a cluster of 700 nodes was found to be especially sensitive to the reliability of the farm as well as of the DAQ/HLT system itself, and future development will concentrate on fault tolerance and stability."
}
The ATLAS Collaboration. ATLAS high level trigger infrastructure, RoI collection and event building. In 15th International Conference on Computing In High Energy and Nuclear Physics. 2006.
Article
@inproceedings{chep-2006-2,
author = "Collaboration, The ATLAS",
title = "ATLAS High Level Trigger Infrastructure, ROI Collection and Event Building",
booktitle = "15th International Conference on Computing In High Energy and Nuclear Physics",
year = "2006",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2006-2.pdf",
abstract = "We describe the base-line design and implementation of the Data Flow and High Level Trigger (HLT) part of the ATLAS Trigger and Data Acquisition (TDAQ) system. We then discuss improvements and generalization of the system design to allow the handling of events in parallel data streams and we present the possibility for event duplication, partial Event Building and data stripping. We then present tests on the deployment and integration of the TDAQ infrastructure and algorithms at the TDAQ 'pre-series' cluster (\textasciitilde 10\\% of full ATLAS TDAQ). Finally, we tackle two HLT performance issues."
}
The ATLAS Collaboration. Studies with the ATLAS trigger and data acquisition pre-series setup. In 15th International Conference on Computing In High Energy and Nuclear Physics. 2006.
Article
@inproceedings{chep-2006,
author = "Collaboration, The ATLAS",
title = "Studies with the ATLAS Trigger and Data Acquisition Pre-Series Setup",
booktitle = "15th International Conference on Computing In High Energy and Nuclear Physics",
year = "2006",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2006.pdf",
abstract = "The pre-series test bed is used to validate the technology and implementation choices by comparing the final ATLAS readout requirements, to the results of performance, functionality and stability studies. We show that all the components which are not running reconstruction algorithms match the final ATLAS requirements. For the others, we calculate the amount of time per event that could be allocated to run these not-yet-finalized algorithms. We also report on the experience gained during these studies while interfacing with a sub-detector for the first time at the experimental area."
}
André Anjos. Sistema Online de Filtragem em um Ambiente com Alta Taxa de Eventos. PhD thesis, COPPE/UFRJ, 2006.
Article
@phdthesis{phd-thesis-2006,
author = "Anjos, André",
title = "Sistema Online de Filtragem em um Ambiente com Alta Taxa de Eventos",
school = "COPPE/UFRJ",
year = "2006",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/phd-thesis-2006.zip",
abstract = "The ATLAS experiment at CERN, Switzerland, will count on a triggering system that separates ordinary physics from that representing decays of the rare Higgs boson. The Second Level of this Trigger system will be composed of 1,000 computers connected by commodity networks, processing each event approved by the First Level Trigger in no more than 10 milliseconds. A set of algorithms described via software will operate at this filtering level. Among them, electron detection systems play a fundamental role in the data acquisition, since the existence of these particles can indicate interesting physics. In this work, we present more efficient discrimination algorithms based on artificial neural networks and a compaction system which benefits from the energy deposit profiles of these particles in calorimeters, reaching a classification efficiency of 97.6\\% for electrons at a false-alarm rate of only 3.2\\% for jets. This detection algorithm is implemented as part of the experiment's complex software infrastructure and can be executed in only 125 microseconds."
}
2005
A. Anjos, R.C. Torres, B.C. Ferreira, T.C. Xavier, and J.M. de Seixas. Discriminação neural de elétrons no segundo nível de trigger do ATLAS. In XXVI Encontro Nacional de Física de Partículas e Campos. October 2005.
Article
@inproceedings{enfpc-2005,
author = "Anjos, A. and Torres, R.C. and Ferreira, B.C. and Xavier, T.C. and de Seixas, J.M.",
title = "Discriminação Neural de Elétrons no Segundo Nível de Trigger do ATLAS",
booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
year = "2005",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-2005.pdf",
abstract = "This work presents a neural discriminator for the ATLAS second-level trigger, addressing the electron/jet separation problem based on calorimetry information. To reduce the high dimensionality of the input data, the regions of interest (RoI) identified at the first level are organized into concentric rings of energy deposition. This kind of data preprocessing allows efficient signal compaction and achieves a high electron identification capability. This system is currently being ported to the ATHENA trigger-emulation environment, in order to obtain a realistic evaluation of its performance. That environment aims at simulating the behaviour of the trigger system, thus helping in the development and validation of the algorithms. For comparison, the proposed system was also implemented using DSP technology."
}
A. Anjos, R.C. Torres, B.C. Ferreira, T.C. Xavier, J.M. Seixas, and D.O. Damazio. Otimização do sistema de trigger do segundo nível do ATLAS baseado em calorimetria. In XXVI Encontro Nacional de Física de Partículas e Campos. October 2005.
Article
@inproceedings{enfpc-2005-2,
author = "Anjos, A. and Torres, R.C. and Ferreira, B.C. and Xavier, T.C. and Seixas, J.M. and Damazio, D.O.",
title = "Otimização do Sistema de Trigger do Segundo Nível do ATLAS Baseado em Calorimetria",
booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
year = "2005",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-2005-2.pdf",
abstract = "This work presents a neural discriminator that operates on the quantities computed by the T2Calo algorithm, which is responsible for electron/jet detection at the Second Level Trigger of the ATLAS experiment. This detection system improves the detection efficiency by almost 10 percentage points, while keeping a performance level compatible with the operational constraints of the trigger system."
}
André Anjos on behalf of the ATLAS Collaboration. Configuration of the ATLAS trigger. In 14th IEEE NPSS Real Time Conference, 990–994. June 2005.
Article
@inproceedings{rt-2005,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "Configuration of the ATLAS trigger",
booktitle = "14th IEEE NPSS Real Time Conference",
year = "2005",
month = "June",
pages = "990--994",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/rt-2005.pdf",
abstract = "The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a rate of 40 MHz. In order to reduce the data rate to about 200 Hz, only potentially interesting events are selected by a three-level trigger system. Its first level is implemented in electronics and firmware whereas the higher trigger levels are based on software. To prepare the full trigger chain for the online event selection according to a certain strategy, a system is being set up that provides the relevant configuration information - e.g. values for hardware registers in level-1 or parameters of high-level trigger algorithms - and stores the corresponding history. The same information is used to configure the offline trigger simulation. In this presentation an overview of the ATLAS trigger system is given concentrating on the event selection strategy and its description. The technical implementation of the configuration system is summarized."
}
The ATLAS Collaboration. ATLAS DataFlow: the read-out subsystem, results from trigger and data-acquisition system testbed studies and from modeling. IEEE Trans. Nucl. Sci., 53 (2006):912–917, June 2005.
Article
@article{ieee-tns-2005,
author = "Collaboration, The ATLAS",
title = "ATLAS DataFlow: the Read-Out Subsystem, Results from Trigger and Data-Acquisition System Testbed Studies and from Modeling",
journal = "IEEE Trans. Nucl. Sci.",
year = "2005",
month = "June",
volume = "53 (2006)",
pages = "912--917",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2005.pdf",
abstract = "In the ATLAS experiment at the LHC, the output of readout hardware specific to each subdetector will be transmitted to buffers, located on custom made PCI cards (ROBINs). The data consist of fragments of events accepted by the first-level trigger at a maximum rate of 100 kHz. Groups of four ROBINs will be hosted in about 150 Read-Out Subsystem (ROS) PCs. Event data are forwarded on request via Gigabit Ethernet links and switches to the second-level trigger or to the Event builder. In this paper a discussion of the functionality and real-time properties of the ROS is combined with a presentation of measurement and modelling results for a testbed with a size of about 20\\% of the final DAQ system. Experimental results on strategies for optimizing the system performance, such as utilization of different network architectures and network transfer protocols, are presented for the testbed, together with extrapolations to the full system."
}
The ATLAS Collaboration. Implementation and performance of the seeded reconstruction for the ATLAS event filter selection software. IEEE Trans. Nucl. Sci., 53 (2007):864–869, June 2005.
Article
@article{ieee-tns-2005-2,
author = "Collaboration, The ATLAS",
title = "Implementation and Performance of the Seeded Reconstruction for the ATLAS Event Filter Selection Software",
journal = "IEEE Trans. Nucl. Sci.",
year = "2005",
month = "June",
volume = "53 (2007)",
pages = "864--869",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-2005-2.pdf",
abstract = "ATLAS is one of the four LHC experiments that will start data taking in 2007, designed to cover a wide range of physics topics. The ATLAS trigger system has to cope with a rate of 40 MHz and 23 interactions per bunch crossing. It is divided in three different levels. The first one (hardware based) provides a signature that is confirmed by the following trigger levels (software based) by running a sequence of algorithms and validating the signal step by step, looking only at the region of space indicated by the first trigger level (seeding). In this paper, we present the performance of one of these sequences that runs at the Event Filter level (third level) and is composed of clustering at the calorimeter, track reconstruction and matching."
}
The ATLAS Collaboration. Overview of the high-level trigger electron and photon selection for the ATLAS experiment at the LHC. IEEE Transactions on Nuclear Science, 53:2839–2843, June 2005.
Article
@article{ieee-tns-2005-3,
author = "Collaboration, The ATLAS",
title = "Overview of the High-Level Trigger Electron and Photon Selection for the ATLAS Experiment at the LHC",
journal = "IEEE Transactions on Nuclear Science",
year = "2005",
month = "June",
volume = "53",
OPTnumber = "",
pages = "2839--2843",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2005-3.pdf",
abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV center-of-mass energy, whilst rejecting the enormous number of background events. The High-Level Trigger (HLT = second level trigger and Event Filter), which is a software-based trigger, will need to reduce the level-1 output rate of \textasciitilde 75 kHz to \textasciitilde 200 Hz written out to mass storage. In this talk an overview of the current physics and system performance of the HLT selection for electrons and photons is given. The performance has been evaluated using Monte Carlo simulations and has been partly demonstrated in the ATLAS testbeam in 2004. The efficiency for the signal channels, the rate expected for the selection, the global data preparation and execution times will be highlighted. Furthermore, some physics examples will be discussed to demonstrate that the triggers are well adapted for the physics programme envisaged at the LHC."
}
The ATLAS Collaboration. Implementation and performance of a tau lepton selection within the ATLAS trigger system at the LHC. In 9th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications. 2005.
Article
@inproceedings{icatpp-2005,
author = "Collaboration, The ATLAS",
title = "Implementation and performance of a tau lepton selection within the ATLAS trigger system at the LHC",
booktitle = "9th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications",
year = "2005",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icatpp-2005.pdf",
abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) has an interaction rate of up to 1 GHz. The trigger must efficiently select interesting events while rejecting the large amount of background. The First Level trigger will reduce this rate to around O(75 kHz). Subsequently, the High Level Trigger (HLT), comprising the Second Level trigger and the Event Filter, will reduce this rate by a factor of O(1000). Triggering on taus is important for Higgs and SUSY searches at the LHC. In this paper tau trigger selections are presented, based on a lepton trigger if the tau decays leptonically or on a dedicated tau hadron trigger if the tau decays semileptonically. We present the signal efficiency with the electron trigger using the data sample A → tau tau → e + hadron, and rate studies obtained from the dijet sample."
}
The ATLAS Collaboration. Muon reconstruction and identification for the event filter of the ATLAS experiment. In 9th ICATPP Conference on High Energy Physics. 2005.
Article
@inproceedings{icatapp-2005-2,
author = "Collaboration, The ATLAS",
title = "Muon Reconstruction and Identification for the Event Filter of the ATLAS experiment",
booktitle = "9th ICATPP Conference on High Energy Physics",
year = "2005",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icatpp-2005-2.pdf",
abstract = "The ATLAS Trigger requires high efficiency and selectivity in order to keep the full physics potential of the experiment and to reject uninteresting processes from the 40 MHz event production rate of the LHC. These goals are achieved with a trigger composed of three sequential levels of increasing accuracy that have to reduce the output event rate down to \textasciitilde 100 Hz. This work focuses on muon reconstruction and identification for the third level (Event Filter), for which specific algorithms from the off-line environment have been adapted to work in the trigger framework. Two different strategies for accessing data (wrapped and seeded modes) are described and their reconstruction potential is then shown in terms of efficiencies, resolutions and fake muon rejection power."
}
2004
The ATLAS Collaboration. Portable gathering system for monitoring and online calibration at ATLAS. In Computing in High Energy Physics and Nuclear Physics 2004. October 2004.
Article
@inproceedings{chep-2004,
author = "Collaboration, The ATLAS",
title = "Portable Gathering System for Monitoring and Online Calibration at ATLAS",
booktitle = "Computing in High Energy Physics and Nuclear Physics 2004",
year = "2004",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2004.pdf",
abstract = "During the runtime of any experiment, a central monitoring system that detects problems as soon as they appear has an essential role. In a large experiment, like ATLAS, the online data acquisition system is distributed across the nodes of large farms, each of them running several processes that analyse a fraction of the events. In this architecture, it is necessary to have a central process that collects all the monitoring data from the different nodes, produces full statistics histograms and analyses them. In this paper we present the design of such a system, called the gatherer. It allows the collection of any monitoring object, such as histograms, from the farm nodes and from any process in the DAQ, trigger and reconstruction chain. It also adds up the statistics, if required, and processes user-defined algorithms in order to analyse the monitoring data. The results are sent to a centralized display, which shows the information online, and to the archiving system, triggering alarms in case of problems. The innovation of this system is that it conceptually abstracts several underlying communication protocols, being able to talk with different processes using different protocols at the same time and, therefore, providing maximum flexibility. The software is easily adaptable to any trigger-DAQ system. The first prototype of the gathering system has been implemented for ATLAS and has been running during this year's combined test beam. An evaluation of this first prototype will also be presented."
}
The ATLAS Collaboration. Performance of the ATLAS DAQ DataFlow system. In Computing in High Energy Physics and Nuclear Physics. October 2004. doi:10.5170/CERN-2005-002.91.
Article
@inproceedings{chep-2004-2,
author = "Collaboration, The ATLAS",
title = "Performance of the ATLAS DAQ DataFlow system",
booktitle = "Computing in High Energy Physics and Nuclear Physics",
year = "2004",
month = "October",
doi = "10.5170/CERN-2005-002.91",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2004-2.pdf",
abstract = "The baseline DAQ architecture of the ATLAS Experiment at LHC is introduced, and its present implementation and the performance of the DAQ components as measured in a laboratory environment are summarized. It will be shown that the discrete event simulation model of the DAQ system, tuned using these measurements, predicts the behaviour of the prototype configurations well, after which predictions for the final ATLAS system are presented. With the currently available hardware and software, a system using \textasciitilde 140 ROSs with a 3 GHz single CPU, \textasciitilde 100 SFIs with dual 2.4 GHz CPUs and \textasciitilde 500 L2PUs with dual 3.06 GHz CPUs."
}
The ATLAS Collaboration. Design, deployment and functional tests of the on-line event filter for the ATLAS experiment at LHC. In Nuclear Science Symposium and Medical Imaging Conference. October 2004.
Article
@inproceedings{nss-2004-2,
author = "Collaboration, The ATLAS",
title = "Design, deployment and functional tests of the on-line Event Filter for the ATLAS experiment at LHC",
booktitle = "Nuclear Science Symposium and Medical Imaging Conference",
year = "2004",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nss-2004-2.pdf",
abstract = "The Event Filter selection stage is a fundamental component of the ATLAS Trigger and Data Acquisition architecture. Its primary function is the reduction of data flow and rate to values acceptable by the mass storage operations and by the subsequent off-line data reconstruction and analysis steps. The computing instrument of the EF is generally organized as a set of independent sub-farms, each connected to one output of the Event Builder switch fabric. Each sub-farm comprises a number of processors analyzing several complete events in parallel. This paper describes the design of the ATLAS EF system, its deployment in the 2004 ATLAS combined test beam together with some examples of integrating selection and monitoring algorithms. Since the processing algorithms are not specially designed for EF but are inherited as much as possible from the off-line ones, special emphasis is reserved to system reliability and data security, in particular for the case of failures in the processing algorithms. Another key design element has been system modularity and scalability. The EF shall be able to follow technology evolution and should allow for using additional processing resources possibly remotely located."
}
The ATLAS Collaboration. Implementation and performance of the high level trigger electron and photon selection for the ATLAS experiment at the LHC. In IEEE Nuclear Science Symposium and Medical Imaging Conference. October 2004.
Article
@inproceedings{nss-2004-3,
author = "Collaboration, The ATLAS",
title = "Implementation and Performance of the High Level Trigger Electron and Photon Selection for the ATLAS Experiment at the LHC",
booktitle = "IEEE Nuclear Science Symposium and Medical Imaging Conference",
year = "2004",
month = "October",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nss-2004-3.pdf",
abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV center of mass energy, while rejecting the enormous number of background events, stemming from an interaction rate of up to 10^9 Hz. The First Level trigger will reduce this rate to around O(100 kHz). Subsequently, the High Level Trigger (HLT), which is comprised of the Second Level trigger and the Event Filter, will need to further reduce this rate by a factor of O(10^3). The HLT selection is software based and will be implemented on commercial CPUs, using a common framework built on the standard ATLAS object oriented software architecture. In this paper an overview of the current implementation of the selection for electrons and photons in the HLT is given. The performance of this implementation has been evaluated using Monte Carlo simulations in terms of the efficiency for the signal channels, rate expected for the selection, data preparation times, and algorithm execution times. Besides the efficiency and rate estimates, some physics examples will be discussed, showing that the triggers are well adapted for the physics programme envisaged at LHC. The electron and photon trigger software is also being exercised at the ATLAS 2004 Combined Test Beam, where components from all ATLAS subdetectors are taking data together along the H8 SPS extraction line; from these tests a validation of the selection architecture chosen in a real on-line environment is expected."
}
A. Anjos and J.M. Seixas. Os filtros de alto nível do experimento ATLAS. In XXVI Encontro Nacional de Física de Partículas e Campos. August 2004.
Article
@inproceedings{enfpc-2004,
author = "Anjos, A. and Seixas, J.M.",
title = "Os Filtros de Alto Nível do Experimento ATLAS",
booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
year = "2004",
month = "August",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-2004.pdf",
abstract = "The ATLAS Experiment relies on a rather complex trigger system divided into 4 large subsystems: (i) the First Level Trigger, which performs the first steps of event selection in the trigger chain; (ii) the Online Software, responsible for the control and operation of the system; (iii) the Data Flow system (Dataflow), which coordinates the transmission and storage of the experiment's detector data; (iv) the High-Level Triggers, which implement the discrimination algorithms and represent the top of the event selection chain in ATLAS. To increase the portability of the trigger algorithms developed across the experiment's community, the developers of the High-Level Triggers (HLT) proposed reusing the Athena online programming environment within the system that will operate in real time. To this end, the HLT uses the tools provided by the Data Flow subsystem to coordinate the transfer of information into and out of the system's processing nodes. Among other constraints, the final product must be sufficiently fast and able to operate as concurrent tasks on multi-processor (SMP) machines running Linux. In this work we present some of the problems and solutions found by the group in the development and testing of the set of libraries that make up the HLT."
}
André Anjos on behalf of the ATLAS Collaboration. The second level trigger of the ATLAS experiment at CERN's LHC. IEEE Transactions on Nuclear Science, 51(3):909–914, July 2004.
Article
@article{ieee-tns-2004-6,
author = "on behalf of the ATLAS Collaboration, André Anjos",
title = "The Second Level Trigger of the ATLAS Experiment at CERN's LHC",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "July",
volume = "51",
number = "3",
pages = "909--914",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-6.pdf",
abstract = "The Trigger System of the ATLAS experiment reduces the rate of events produced by proton-proton collisions at CERN's Large Hadron Collider (LHC) in three successive steps from 40 MHz to \textasciitilde\ 100, 1 and 0.2 kHz respectively. The ATLAS Second Level Trigger is original in several ways. It makes use of information provided by the First Level Trigger which identifies Regions of Interest (RoI) indicating where the most significant activity has occurred within the detector. Accessing detector data in RoIs only reduces the estimated 100 Gbytes/s data rate by a factor 100. Apart from a custom interface to acquire the RoI information, the Second Level Trigger is implemented in software. Another cost saving approach is the development of Trigger Selection software in an offline environment using a common framework for the High Level Trigger and Reconstruction Software. Consequently, the Second Level Trigger draws on software developed in two largely independent domains: real time oriented dataflow software combined with offline selection software. In this paper we report on experience gained and results obtained with second generation prototype software of both domains. Tests of the performance of the data collection have been carried out on testbeds consisting of PCs running Linux and interconnected by Gbit Ethernet switches. The selection software has been tested using simulated detector data preloaded in detector readout buffers."
}
The ATLAS Collaboration. Studies for a common selection software environment in ATLAS: from the Level-2 trigger to the offline reconstruction. IEEE Transactions on Nuclear Science, 51(3):915–920, June 2004.
Article
@article{ieee-tns-2004-2,
author = "Collaboration, The ATLAS",
title = "Studies for a common selection software environment in ATLAS: from the Level-2 Trigger to the offline reconstruction",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "June",
volume = "51",
number = "3",
pages = "915--920",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-2.pdf",
abstract = "The ATLAS High Level Trigger's primary function of event selection will be accomplished with a Level-2 trigger farm and an Event Filter farm, both running software components developed in the Atlas offline reconstruction framework. While this approach provides a unified software framework for event selection, it poses strict requirements on offline components critical for the Level-2 trigger. A Level-2 decision in Atlas must typically be accomplished within 10 ms and with multiple event processing in concurrent threads. In order to address these constraints, prototypes have been developed that incorporate elements of the Atlas Data Flow, High Level Trigger, and offline framework software. To realize a homogeneous software environment for offline components in the High Level Trigger, the Level-2 Steering Controller was developed. With electron/gamma- and muon-selection slices it has been shown that the required performance can be reached, if the offline components used are carefully designed and optimized for the application in the High Level Trigger."
}
The ATLAS Collaboration. ATLAS TDAQ data collection software. IEEE Transactions on Nuclear Science, 51:585–590, June 2004.
Article
@article{ieee-tns-2004-3,
author = "Collaboration, The ATLAS",
title = "ATLAS TDAQ data collection software",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "June",
volume = "51",
OPTnumber = "",
pages = "585--590",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-3.pdf",
abstract = "The DataCollection (DC) is a subsystem of the ATLAS Trigger and DAQ system. It is responsible for the movement of event data from the ReadOut subsystem to the Second Level Trigger and to the Event Filter. This functionality is distributed on several software applications running on Linux PCs interconnected with Gigabit Ethernet. For the design and implementation of these applications a common approach has been adopted. This approach leads to the design and implementation of a common DC software framework providing a suite of common services."
}
The ATLAS Collaboration. The base-line DataFlow system of the ATLAS trigger and DAQ. IEEE Transactions on Nuclear Science, 51(3):470–475, June 2004.
Article
@article{ieee-tns-2004-4,
author = "Collaboration, The ATLAS",
title = "The base-line DataFlow system of the ATLAS Trigger and DAQ",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "June",
volume = "51",
number = "3",
pages = "470--475",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-4.pdf",
abstract = "The base-line design and implementation of the ATLAS DAQ DataFlow system is described. The main components realizing the DataFlow system, their interactions, bandwidths and rates are discussed, and performance measurements on a 10\\% scale prototype for the final Atlas TDAQ DataFlow system are presented. This prototype is a combination of custom-designed components and of multi-threaded software applications implemented in C++ and running in a Linux environment on commercially available PCs interconnected by a fully switched gigabit Ethernet network."
}
The ATLAS Collaboration. Algorithms for the ATLAS high-level trigger. IEEE Transactions on Nuclear Science, 51(3):367–374, June 2004.
Article
@article{ieee-tns-2004-5,
author = "Collaboration, The ATLAS",
title = "Algorithms for the ATLAS high-level trigger",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "June",
volume = "51",
number = "3",
pages = "367--374",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-5.pdf",
abstract = "Following rigorous software design and analysis methods, an object-based architecture has been developed to derive the second- and third-level trigger decisions for the future ATLAS detector at the LHC. The functional components within this system responsible for generating elements of the trigger decisions are algorithms running within the software architecture. Relevant aspects of the architecture are reviewed along with concrete examples of specific algorithms and their performance in 'vertical' slices of various physics selection strategies."
}
J.T. Baines, C.P. Bee, A. Bogaerts, M. Bosman, D. Botterill, B. Caron, A. Anjos, F. Etienne, S. González, K. Karr, W. Li, C. Meessen, G. Merino, A. Negri, J. L. Pinfold, P. Pinto, Z. Qian, F. Touchard, P. Werner, S. Wheeler, F.J. Wickens, W. Wiedenmann, and G. Zobernig. An overview of the ATLAS high-level trigger dataflow and supervision. IEEE Transactions on Nuclear Science, 51(3):361–366, June 2004.
Article
@article{ieee-tns-2004-7,
author = "Baines, J.T. and Bee, C.P. and Bogaerts, A. and Bosman, M. and Botterill, D. and Caron, B. and Anjos, A. and Etienne, F. and González, S. and Karr, K. and Li, W. and Meessen, C. and Merino, G. and Negri, A. and Pinfold, J. L. and Pinto, P. and Qian, Z. and Touchard, F. and Werner, P. and Wheeler, S. and Wickens, F.J. and Wiedenmann, W. and Zobernig, G.",
title = "An Overview of the ATLAS High-Level Trigger Dataflow and Supervision",
journal = "IEEE Transactions on Nuclear Science",
year = "2004",
month = "June",
volume = "51",
number = "3",
pages = "361--366",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004-7.pdf",
abstract = ""
}
The ATLAS Collaboration. Architecture of the ATLAS high level trigger event selection software. Nucl. Instrum. Methods Phys. Res., 518(1–2):537–541, February 2004.
Article
@article{nima-2004,
author = "Collaboration, The ATLAS",
title = "Architecture of the ATLAS high level trigger event selection software",
journal = "Nucl. Instrum. Methods Phys. Res.",
year = "2004",
month = "February",
volume = "518",
number = "1--2",
pages = "537--541",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nima-2004.pdf",
abstract = "The ATLAS High Level Trigger (HLT) consists of two selection steps: the second level trigger and the event filter. Both will be implemented in software, running on mostly commodity hardware. Both levels have a coherent approach to event selection, so a common core software framework has been designed to maximize this coherency, while allowing sufficient flexibility to meet the different interfaces and requirements of the two different levels. The approach is extended further to allow the software to run in an off-line simulation and reconstruction environment for the purposes of development. This paper describes the architecture and high level design of the software."
}
The ATLAS Collaboration. Online muon reconstruction in the ATLAS Level-2 trigger system. In Nuclear Science Symposium and Medical Imaging Conference. 2004.
Article
@inproceedings{nss-2004,
author = "Collaboration, The ATLAS",
title = "Online Muon Reconstruction in the ATLAS Level-2 trigger system",
booktitle = "Nuclear Science Symposium and Medical Imaging Conference",
year = "2004",
OPTvolume = "",
OPTnumber = "",
OPTpages = "",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nss-2004.pdf",
abstract = "To cope with the 40 MHz event production rate of the LHC, the trigger of the ATLAS experiment selects the events in three sequential steps of increasing complexity and accuracy whose final results are close to the offline reconstruction. The Level-1, implemented with custom hardware, identifies physics objects within Regions of Interest and operates a first reduction of the event rate to 75 kHz. The higher trigger levels provide a software-based event selection which further reduces the event rate to about 100 Hz. This paper presents the algorithm (muFast) employed at Level-2 to confirm the muon candidates flagged by the Level-1. muFast identifies hits of muon tracks inside the Muon Spectrometer and provides a precise measurement of the muon momentum at the production vertex. The algorithm must process the Level-1 muon output rate (\textasciitilde 20 kHz), so particular care has been taken over its optimization. The result is a very fast track reconstruction algorithm with good physics performance which, in some cases, approaches that of the offline reconstruction: it computes the pT of prompt muons with a resolution of 5.5\\% at 6 GeV and 4.0\\% at 20 GeV and with an efficiency of about 95\\%. The algorithm requires an overall execution time of \textasciitilde 1 ms on a 100 SpecInt95 machine."
}
2003
A. Anjos and J.M. Seixas. Neural particle discrimination for triggering interesting physics channels with calorimetry data. Nuclear Instruments and Methods in Physics Research A - Accelerators, Spectrometers, Detectors and Associated Equipment, 502:713–715, August 2003. doi:10.1016/S0168-9002(03)00553-9.
Article
@article{nima-2003,
author = "Anjos, A. and Seixas, J.M.",
title = "Neural particle discrimination for triggering interesting physics channels with calorimetry data",
journal = "Nuclear Instruments and Methods in Physics Research A - Accelerators, Spectrometers, Detectors and Associated Equipment",
year = "2003",
month = "August",
volume = "502",
pages = "713--715",
doi = "10.1016/S0168-9002(03)00553-9",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/nima-2003.pdf",
abstract = "This article introduces a triggering scheme for high input rate processors, based on neural networks. The technique is applied to the Electron/Jet discrimination problem, present at the second level trigger of the ATLAS experiment, being constructed at CERN. The proposed solution outperforms the scheme adopted nowadays at CERN, both in discrimination efficiency and performance, becoming a candidate algorithm for implementation at the experiment."
}
The ATLAS Collaboration. An overview of algorithms for the ATLAS high level trigger. IEEE Transactions on Nuclear Science, 51(3 (2004)):367–374, June 2003.
Article
@article{ieee-tns-2004,
author = "Collaboration, The ATLAS",
title = "An Overview of Algorithms for the ATLAS High Level Trigger",
journal = "IEEE Transactions on Nuclear Science",
year = "2003",
month = "June",
volume = "51",
number = "3 (2004)",
pages = "367--374",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-2004.pdf",
abstract = "Following rigorous software design and analysis methods, an object-based architecture has been developed to derive the second- and third-level trigger decisions for the future ATLAS detector at the LHC. The functional components within this system responsible for generating elements of the trigger decisions are algorithms running within the software architecture. Relevant aspects of the architecture are reviewed along with concrete examples of specific algorithms."
}
The ATLAS Collaboration. Experience with multi-threaded C++ applications in the ATLAS dataflow software. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article
@inproceedings{chep-2003-1,
author = "Collaboration, The ATLAS",
title = "Experience with multi-threaded C++ applications in the ATLAS dataflow software",
booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2003-1.pdf",
abstract = "The DataFlow is a sub-system of the ATLAS data acquisition responsible for the reception, buffering and subsequent movement of partial and full event data to the higher level triggers: Level 2 and Event Filter. The design of the software is based on OO methodology and its implementation relies heavily on the use of POSIX threads and the Standard Template Library. This article presents our experience with Linux, POSIX threads and the Standard Template Library in the real-time environment of the ATLAS data flow."
}
The ATLAS Collaboration. The algorithm steering and trigger decision mechanism of the ATLAS high level trigger. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article
@inproceedings{chep-2003-2,
author = "Collaboration, The ATLAS",
title = "The Algorithm Steering and Trigger Decision mechanism of the ATLAS High Level Trigger",
booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2003-2.pdf",
abstract = "Given the extremely high output rate foreseen at the LHC and the general-purpose nature of the ATLAS experiment, an efficient and flexible way to select events in the High Level Trigger is needed. An extremely flexible solution is proposed that allows for early rejection of unwanted events and an easily configurable way to choose algorithms and to specify the criteria for trigger decisions. It is implemented in the standard ATLAS object-oriented software framework, Athena. The early rejection is achieved by breaking the decision process down into sequential steps. The configuration of each step defines sequences of algorithms which should be used to process the data, and 'trigger menus' that define which physics signatures must be satisfied to continue on to the next step, and ultimately to accept the event. A navigation system has been built on top of the standard Athena transient store (StoreGate) to link the event data together in a tree-like structure. This is fundamental to the seeding mechanism, by which data from one step is presented to the next. The design makes it straightforward to utilize existing off-line reconstruction data classes and algorithms when they are suitable."
}
The ATLAS Collaboration. A new implementation of the region-of-interest strategy for the ATLAS second level trigger. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article
@inproceedings{chep-2003-3,
author = "Collaboration, The ATLAS",
title = "A New Implementation of the Region-of-Interest Strategy for the ATLAS Second Level Trigger",
booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2003-3.pdf",
abstract = "Among the many challenges presented by the future ATLAS detector at the LHC are the high data-taking rate and volume and the derivation of a rapid trigger decision with limited resources. To address this challenge within the ATLAS second level trigger system, a Region-of-Interest mechanism has been adopted which dramatically reduces the relevant fiducial volume necessary to be read out and processed to small regions guided by the hardware-based first level trigger. Software has been developed to allow fast translation between arbitrary geometric regions and identifiers of small collections of the event data. This facilitates on-demand data retrieval and collection building. The system is optimized to minimize the amount of data transferred and unnecessary building of complex objects. Details of the design and implementation are presented along with preliminary performance results."
}
The ATLAS Collaboration. The DataFlow system of the ATLAS trigger and DAQ. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article
@inproceedings{chep-2003-4,
author = "Collaboration, The ATLAS",
title = "The DataFlow System of the ATLAS Trigger and DAQ",
booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/chep-2003-4.pdf",
abstract = "The baseline design and implementation of the DataFlow system, to be documented in the ATLAS DAQ/HLT Technical Design Report in summer 2003, will be presented. Emphasis will be placed on the system performance and scalability based on the results from prototyping studies which have maximised the use of commercially available hardware."
}
The ATLAS Collaboration. The ATLAS HLT, DAQ and DCS Technical Design Report. Technical Report, CERN Publication, 2003.
Article
@techreport{cern-tdaq-tdr-2003,
author = "Collaboration, The ATLAS",
title = "The ATLAS HLT, DAQ and DCS Technical Design Report",
institution = "CERN Publication",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/cern-tdaq-tdr-2003.pdf",
abstract = "This document contains the 'blue-print' specifications of the Trigger/DAQ systems of ATLAS."
}
The ATLAS Collaboration. The baseline dataflow system of the ATLAS trigger and DAQ. In 9th Workshop on Electronics for LHC Experiments. 2003.
Article
@inproceedings{elhc-2003,
author = "Collaboration, The ATLAS",
title = "The baseline dataflow system of the ATLAS trigger and DAQ",
booktitle = "9th Workshop on Electronics for LHC Experiments",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/elhc-2003.pdf",
abstract = "In this paper the baseline design of the ATLAS High Level Trigger and Data Acquisition system with respect to the DataFlow aspects, as presented in the recently submitted ATLAS Trigger/DAQ/Controls Technical Design Report [1], is reviewed and recent results of testbed measurements and from modelling are discussed."
}
The ATLAS Collaboration. Architecture of the ATLAS online physics-selection software at LHC. In Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications. 2003.
Article
@inproceedings{astro-2003,
author = "Collaboration, The ATLAS",
title = "Architecture of the ATLAS online physics-selection software at LHC",
booktitle = "Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications",
year = "2003",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/astro-2003.pdf",
abstract = "Event filtering in the ATLAS experiment is organized in two distinct levels: the Second Level Trigger and the Event Filter. A unified approach to event selection at both levels was chosen. Accordingly, a set of core routines was designed to maximize the sharing of offline interfaces and components, while retaining enough flexibility to meet the operational requirements of the Trigger System, notably those related to performance and robustness. This article describes the architecture and design of the event selection system and shows how this implementation is compatible with the challenges of the experiment."
}
2001
André Anjos. Sistema neuronal rápido de decisão baseado em calorimetria de altas energias. Master's thesis, COPPE/UFRJ, 2001.
Article
@mastersthesis{msc-thesis-2001,
author = "Anjos, André",
title = "Sistema neuronal rápido de decisão baseado em calorimetria de altas energias",
school = "COPPE/UFRJ",
year = "2001",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/msc-thesis-2001.pdf",
abstract = "This work (written in Portuguese) develops a fast neural classifier for high energy particle discrimination (electron/jet) at the second level trigger of ATLAS, at CERN, Switzerland. The classifier is fed by the information from one of the ATLAS detectors, the calorimeter, a highly segmented detector which measures the energy of particles with high resolution. The information is preprocessed in a clever way, by building concentric energy ring sums, which reduces the input dimensionality significantly. Despite the high information compaction rate, the designed system achieves a very high discrimination efficiency (97\\% for electrons and 95.1\\% for jets), outperforming the classical solution implemented nowadays at the second level trigger. A system implementation on a fast digital signal processor (DSP) is presented, and its performance is evaluated in both speed and accuracy."
}
A. Anjos and J.M. Seixas. Redes neurais especialistas para a separação elétron-jato usando calorímetros multi-camadas e multi-segmentados. In XXII Encontro Nacional de Física de Partículas e Campos. 2001.
Article
@inproceedings{enfpc-2001,
author = "Anjos, A. and Seixas, J.M.",
title = "Redes Neurais especialistas para a separação Elétron-Jato usando Calorímetros multi-camadas e multi-segmentados",
booktitle = "XXII Encontro Nacional de Física de Partículas e Campos",
year = "2001",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-2001.pdf",
abstract = "The ATLAS experiment will become operational in 2006. The main goal of this experiment is the detection of the Higgs boson using, among other detector types, calorimeters. One of the most important detection channels in the experiment is that of electrons with high transverse energy, representing 30 to 40\\% of all the signatures to be analysed by the Trigger System. (Particle) jets are commonly mistaken for electrons because of the way they interact with the calorimeters. In this work, we present an electron-jet discrimination system based on expert neural networks, using data from the ATLAS calorimeters. Once trained, this system compacts the input variable space (calorimeter cells) into a subspace that retains the features required for efficient electron detection. The results presented are better than those obtained with techniques developed at CERN for the same purpose."
}
2000
André Rabello dos Anjos and José Manoel de Seixas. Mapeamento em anéis para uma separação neuronal elétron-jato usando calorímetros multi-camadas e multi-segmentados. In XIX Encontro Nacional de Física de Partículas e Campos. 2000.
Article
@inproceedings{enfpc-2000,
author = "dos Anjos, André Rabello and de Seixas, José Manoel",
title = "Mapeamento em anéis para uma separação neuronal elétron-jato usando calorímetros multi-camadas e multi-segmentados",
booktitle = "XIX Encontro Nacional de Física de Partículas e Campos",
year = "2000",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-2000.pdf",
abstract = "In this work we propose that the analysis carried out at the Second Level Trigger of the ATLAS experiment, at CERN, be performed by neural processing over the region of interest previously flagged by the First Level in the calorimeters. Because they depend on the position of the RoI in the detector, the number of layers, granularities and depths of the calorimeter cells are unknown until the event reaches the analysis system. Even so, the number of cells to analyse is estimated at around 1000 per RoI. The separation efficiencies obtained, the execution times and a comparison with the efficiency of other methods employed for the same task are discussed."
}
1999
André Rabello dos Anjos and José Manoel de Seixas. Integrando plataformas e algoritmos para o segundo nível de trigger do experimento ATLAS. In Encontro Nacional de Física de Partículas e Campos. 1999.
Article
@inproceedings{enfpc-1999,
author = "dos Anjos, André Rabello and de Seixas, José Manoel",
title = "Integrando Plataformas e Algoritmos para o Segundo Nível de Trigger do Experimento ATLAS",
booktitle = "Encontro Nacional de Física de Partículas e Campos",
year = "1999",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/enfpc-1999.pdf",
abstract = "This article summarizes the work carried out in the portability studies of the dataflow infrastructure of the Trigger System, originally written in C and running on commercial operating systems, towards an object-oriented implementation based on C++ running on Linux. It discusses the advantages of this approach, both in terms of maintainability and of the final cost of the project."
}
1998
J.M. Seixas, L.P. Caloba, A.R. Anjos, B. Kastrup, A.C.H. Dantas, and R. Linhares. A neural online triggering system based on parallel processing. IEEE Transactions on Nuclear Science, 45(4):1814–1818, August 1998.
Article
@article{ieee-tns-1998,
author = "Seixas, J.M. and Caloba, L.P. and Anjos, A.R. and Kastrup, B. and Dantas, A.C.H. and Linhares, R.",
title = "A neural online triggering system based on parallel processing",
journal = "IEEE Transactions on Nuclear Science",
year = "1998",
month = "August",
volume = "45",
number = "4",
pages = "1814--1818",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/ieee-tns-1998.pdf",
abstract = "The study of a prototype of the second-level triggering system for operation at LHC conditions is addressed by means of a parallel machine implementation. The 16 node transputer based machine uses a fast digital signal processor acting as a coprocessor for optimizing signal processing applications. A C-language development environment is used for running all applications at ultimate speed. The implementation is based on information supplied by four detectors and includes two phases of system operation: feature extraction and global decision. Feature extraction for calorimeters and global decision processing are performed by means of neural networks. Preprocessing and neural network parameters rest in memory and the activation function is implemented using a look up table. Simulated data for the second-level trigger operation are used for performance evaluation."
}
J. M. Seixas, A. R. Anjos, C. B. Prado, L. P. Calôba, A. C. H. Dantas, and J. C. R. Aguiar. Neural classifiers implemented in a transputer based parallel machine. In International Meeting on Vector and Parallel Processing (VECPAR). 1998.
Article
@inproceedings{vecpar-1998,
author = "Seixas, J. M. and Anjos, A. R. and Prado, C. B. and Calôba, L. P. and Dantas, A. C. H. and Aguiar, J. C. R.",
title = "Neural classifiers implemented in a transputer based parallel machine",
booktitle = "International Meeting on Vector and Parallel Processing (VECPAR)",
year = "1998",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/vecpar-1998.pdf",
abstract = "A transputer based parallel machine is used as a development platform for fast neural signal processing applications in physics and electricity. The 16 node machine houses 32-bit floating point digital signal processors running as coprocessor for the transputers, so that signal processing can be optimized. The application in physics consists in a prototype of an online validation system for a high event rate collider experiment, which is implemented using neural networks for physics process identification. In electricity, a nonintrusive load monitoring system for household appliances is developed using a neural discriminator to identify seven groups of equipment."
}
André Rabello dos Anjos, Augusto Dantas, and José Manoel de Seixas. Um protótipo do sistema de validação do nível 2 para as condições do LHC. In Encontro Nacional de Física de Partículas e Campos, 32–33. 1998.
Article
@inproceedings{enfpc-1998,
author = "dos Anjos, André Rabello and Dantas, Augusto and de Seixas, José Manoel",
title = "Um Protótipo do Sistema de Validação do Nível 2 para as Condições do LHC",
booktitle = "Encontro Nacional de Física de Partículas e Campos",
year = "1998",
pages = "32--33",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/2009/01/09/sbf98.pdf",
abstract = "The ATLAS experiment intends to confirm the existence of the Higgs boson. To this end, a large detection and acquisition system is being designed. The acquisition system has the task of separating, in real time, interactions originating from the decay of a Higgs from ordinary physics. Event filtering in the acquisition system is conceived in three levels of increasing complexity and decreasing speed. The second level intends to use networks of personal computers (PCs) interconnected by fast networking systems. The choice of vendors, operating systems and processing algorithms has not yet been made, but efforts towards this decision are under way. In this work, a fraction of the second-level filter is developed using parallel processing, artificial neural networks and DSPs."
}
1997
J.M. Seixas, L.P. Calôba, A.R. Anjos, A.C.H. Dantas, and R. Linhares. Fast neural decision system based on DSPs and parallel processing. In International Conference on Signal Processing Applications and Technologies, San Diego, USA, 1629–1633. 1997.
Article
@inproceedings{icspat-1997,
author = "Seixas, J.M. and Calôba, L.P. and Anjos, A.R. and Dantas, A.C.H. and Linhares, R.",
title = "Fast Neural Decision System Based On DSPs And Parallel Processing",
booktitle = "International Conference on Signal Processing Applications and Technologies, San Diego, USA",
year = "1997",
pages = "1629--1633",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/icspat-1997.pdf",
abstract = "A prototype of an online event validation system is developed for application in a high-energy collider experiment. The system mainly uses neural networks for extracting rare events with physics significance from a huge background noise. It is based on processing the information collected from different detectors placed around the collision point. Combining a feature extraction phase for each detector with a global decision phase for the final decision on whether or not to discard a given event, the system acts on events previously selected by a first-level analysis that reduces the event rate to 100 kHz. To cope with this input frequency, the proposed system is being emulated on a 16-node transputer-based parallel machine that has a fast digital signal processor running as a co-processor for each node."
}
André Rabello dos Anjos. Sistema de classificação baseado em uma máquina com sistema distribuído. Graduation thesis, Departamento de Eletrônica/UFRJ, 1997.
Article
@mastersthesis{grad-thesis-1997,
type = "Graduation thesis",
author = "dos Anjos, André Rabello",
title = "Sistema de classificação baseado em uma máquina com sistema distribuído",
school = "Departamento de Eletrônica/UFRJ",
year = "1997",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/grad-thesis-1997.pdf",
abstract = "In the search for new physics channels in particle collision experiments, validation (trigger) systems have proven to be of great value. Usually the by-products of particle collisions represent ordinary, well-known physics, while new physics appears camouflaged within this background. The impossibility of recording and analysing the immense data volume produced in these environments demands the use of validation systems to maximize the recording space and minimize the search space for new phenomena. In particular, at CERN, the LHC accelerator/collider, which will become operational in 2005, will use one such validation system based on 3 cascaded levels of increasing complexity and decreasing speed. This system aims at the real-time analysis and filtering of a data volume whose rate reaches the impressive range of 100,000,000 events per second. Splitting the system into 3 distinct stages aims to produce a validation system that is as efficient and dynamic as possible, without overloading any of its parts. For the first level, fast processors with a low level of programmability, capable of sustaining the initial event rate, are foreseen. For the third level, the use of a heavy computing environment is foreseen. At the second level, highly programmable environments will be combined with application parallelization techniques so that the required processing rate of 100,000 events per second is reached. Several types of technology are being tested around the world in order to decide not only on the architecture, but also on the type of equipment to be employed in this extensive classification system. This work is about the implementation, on a machine with distributed processing, of one of the architectures foreseen for the second level of validation (or classification) of the ATLAS/LHC experiment.
The machine in question is a Telmat TN310 system, with processing distributed over 16 HTRAM-standard nodes fully connected through a network of asynchronous switches. The aforementioned architecture foresees the use of data- and flow-parallelism techniques to obtain shorter processing times. The final goal is to understand whether processing on systems similar to a TN310 (we target the type of processing node and the connection pattern between them) can be viable for the second level of validation. This will be achieved through the analysis and abstraction capacity afforded by developing the suggested application on this equipment. The work also includes the development of a global decision unit based on neural networks. This unit constitutes the central process of the validation system. The results achieved are presented, and implementation techniques are discussed throughout the document."
}
1996
J.M. Seixas, L.P. Calôba, and A.R. Anjos. Particle discrimination using sub-optimal filtering techniques. In Congresso Brasileiro de Automatica (CBA), São Paulo, Brasil, 635–640. 1996.
Article
@inproceedings{cba-1996,
author = "Seixas, J.M. and Calôba, L.P. and Anjos, A.R.",
title = "Particle discrimination using sub-optimal filtering techniques",
booktitle = "Congresso Brasileiro de Automatica (CBA), São Paulo, Brasil",
year = "1996",
pages = "635--640",
pdf = "https://www.idiap.ch/\textasciitilde aanjos/papers/cba-1996.pdf",
abstract = "The discrimination of high energy electrons and pions using a scintillating fiber calorimeter is addressed. The discrimination method is based on analyzing the time structure of calorimeter signals and achieves a discrimination response faster than 100 ns. Signals pass through a high performance constant fraction discriminator, and events that lie in the confusion region of this discriminator are analyzed through a sub-optimal filtering technique based on pulse integration. The composed discrimination system achieves 98\\% electron efficiency with less than 0.1\\% of pions being misclassified as electrons."
}