(C) PLOS One [1]. This unaltered content originally appeared in journals.plosone.org.
Licensed under Creative Commons Attribution (CC BY) license.
url:
https://journals.plos.org/plosone/s/licenses-and-copyright
------------
pyActigraphy: Open-source python package for actigraphy data visualization and analysis
Grégory Hammad (GIGA-CRC In Vivo Imaging, University of Liège, Liège), Mathilde Reyt (Psychology and Neuroscience of Cognition, Faculty of Psychology), Nikita Beliy, Marion Baillet
Date: 2021-11
Over the past 40 years, actigraphy has been used to study rest-activity patterns in circadian rhythm and sleep research. Furthermore, given its simplicity of use, there is growing interest in using actigraphy to analyse large population-based samples. Here, we introduce pyActigraphy, a comprehensive toolbox for data visualization and analysis, including multiple sleep detection algorithms and rest-activity rhythm variables. This open-source Python package implements methods to read multiple data formats, quantify various properties of rest-activity rhythms, visualize sleep agendas, automatically detect rest periods and perform more advanced signal processing analyses. The development of this package aims to pave the way towards a comprehensive open-source software suite, supported by a community of both developers and researchers, that would provide all the necessary tools for in-depth and large-scale actigraphy data analyses.
The possibility to continuously record locomotor movements using accelerometers (actigraphy) has allowed field studies of sleep and rest-activity patterns. It has also enabled large-scale data collections, opening new avenues for research. However, each brand of actigraph device encodes recordings in its own format, and closed-source proprietary software is typically used to read and analyse actigraphy data. To provide an alternative to such software, we developed a comprehensive open-source toolbox for actigraphy data analysis, pyActigraphy. It allows researchers to read actigraphy data from 7 different file formats and gives access to a variety of rest-activity rhythm variables, automatic sleep detection algorithms and more advanced signal processing techniques. In addition, to empower researchers and clinicians with respect to their analyses, we created a series of interactive tutorials that illustrate how to implement the key steps of typical actigraphy data analyses. As an open-source project, all kinds of user contributions to our toolbox are welcome. As increasing evidence points to the predictive value of actigraphy-derived rest-activity patterns for brain integrity, we believe that the development of the pyActigraphy package will benefit not only sleep and chronobiology research, but also the neuroscientific community at large.
Funding: The development of the pyActigraphy package is part of the CogNap project that has received funding from the European Research Council under the European Union’s Horizon 2020 research and innovation programme, Grant agreement No. 757763 (to CS). This work was also supported by the Fonds de la Recherche Scientifique - FNRS under Grant nr T.0220.20 (to CS). CS is a research associate and MD is a FRIA grantee of the Fonds de la Recherche Scientifique - FNRS, Belgium (
https://www.frs-fnrs.be/fr/ ). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Copyright: © 2021 Hammad et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Several initiatives to collect, host and share large actigraphy data sets have been successfully carried out over the past years; in 2012, the UK Biobank decided to add 7-day actimetry-derived physical activity data collection [6]. The National Sleep Research Resource [7] was launched in 2014 and currently hosts actigraphy recordings for more than 18,000 subjects. Not only were these data sets successfully used to perform genome-wide association studies, where the number of subjects is often a statistically limiting factor, and to reveal links between rest-activity phenotypes and pathology or genetic background (e.g. [8, 9]), but they could also be crucial for understanding public health issues such as the impact of daylight saving time changes or chronic sleep deprivation. However, processing and analyzing such a large number of recordings remains a challenge. Therefore, the emergence of such biobanks should be matched by the emergence of appropriate analysis tools. Moreover, facilitating access to such analysis tools for actigraphy data would benefit other fields of neuroscience. For example, there is evidence for a link between human brain structure and locomotor activity, whether it is the total amount of activity [10, 11], sleep fragmentation [12] or the integrity of circadian rhythmicity [3, 13]. Human brain functions are also modulated by circadian and/or seasonal rhythmicity [14, 15]. Therefore, a precise assessment of rhythmicity, as allowed by actigraphy, is crucial for functional brain imaging and cognitive studies too. These are only a few of the many examples that emphasize the benefit of extending the use of actigraphy beyond the field of sleep and circadian research.
However, generalizing the findings made with this technique remains difficult; researchers either develop specific, often closed-source, data processing pipelines and/or analysis scripts, which is time-consuming and error-prone and makes the analyses difficult to reproduce, or they rely on commercial toolboxes that are not only costly but also act as black boxes. In addition, cumbersome manual data preprocessing, such as cleaning, hampers the large-scale analyses that are mandatory for reliable and generalizable results.
Actigraphy consists of continuous movement recordings, using small watch-like accelerometers that are usually worn on the wrist or on the chest. As recordings can last several days or weeks, this technique is an adequate tool for in-situ assessments of locomotor activity and the study of rhythmic rest-activity patterns. Consequently, it has been used in the field of sleep and circadian rhythm research [1] to assess night-to-night variability in estimated sleep parameters as well as rest-activity rhythm integrity. For example, intradaily variability has been associated with both cognitive and brain ageing [2, 3], while sleep fragmentation, as quantified by transition probabilities from rest to activity during night-time, has been linked to cognitive performance [4] as well as to increased risk for Alzheimer's disease [5].
Design and implementation
The pyActigraphy package is written in Python 3 (Python Software Foundation,
https://www.python.org/). As illustrated in Fig 1, a dedicated class has been implemented for each file format to extract the corresponding actigraphy data, as well as the associated meta-data. These classes inherit, via multiple inheritance (mixins), from a base class implementing the various functionalities of the pyActigraphy package. This centralized approach has several advantages: classes for new file formats are easy to implement, since they only need to handle reading the acquired data and meta-data, while all other functionalities are inherited from the base class. In addition, newly added metrics or functions are immediately available to all dedicated classes deriving from the base class. This design was chosen to ease contributions from users with various coding skills.
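The design described above can be sketched as follows: device-specific readers only parse their format, while metrics live on a shared base class. Class and method names here are illustrative stand-ins, not the actual pyActigraphy hierarchy.

```python
# Minimal sketch of the base-class design (hypothetical names).

class BaseRaw:
    """Base class: functionalities shared by all device readers."""
    def __init__(self, data):
        self.data = data  # list of (timestamp, count) pairs

    def total_activity(self):
        # Any metric added here is immediately available to every reader.
        return sum(count for _, count in self.data)

class RawHypotheticalDevice(BaseRaw):
    """A reader for a new file format only needs to parse the file;
    all metrics are inherited from BaseRaw."""
    @classmethod
    def from_lines(cls, lines):
        data = []
        for line in lines:
            ts, count = line.split(",")
            data.append((ts, int(count)))
        return cls(data)

raw = RawHypotheticalDevice.from_lines(["2021-01-01T00:00,12", "2021-01-01T00:01,7"])
print(raw.total_activity())  # inherited from BaseRaw → 19
```

A new device format thus costs only a parsing method; every existing and future metric comes for free.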
Most of the variables and algorithms implemented in this package have been developed and validated for actigraphy devices that aggregate data into so-called movement counts. More recent devices now provide access to raw acceleration data, and specific algorithms have been developed for this type of data (e.g. [16, 17]). Nonetheless, it remains possible to convert these data into movement counts, providing backward compatibility between such devices and algorithms validated on count-based devices [18]. This conversion has not yet been implemented in the pyActigraphy package but will be in the near future. However, data already converted to counts can readily be used with our package.
Reading native actigraphy files

The pyActigraphy package provides a unified way to read several actigraphy file formats. Currently, it supports output files from:

- wGT3X-BT, ActiGraph (.agd file format only);
- Actiwatch 4 and MotionWatch 8, CamNtech;
- ActTrust 2, Condor Instruments;
- Daqtometer, Daqtix;
- Actiwatch 2 and Actiwatch Spectrum Plus, Philips Respironics;
- Tempatilumi, CE Brasil.

For each file format, a dedicated class has been implemented to extract the corresponding actigraphy data, as well as the associated meta-data. These classes inherit from a base class implementing the various functionalities of the pyActigraphy package. In addition, the package allows users to read actigraphy recordings either individually, for visual inspection for example, or by batch, for analysis purposes.
Masking and cleaning data

Before analysing the data, spurious periods of inactivity, during which the actigraph was most likely removed by the participant, need to be discarded from the activity recordings. The pyActigraphy package implements a method to mask such periods, either manually or using timestamps specified in a text file. For convenience, it is also possible to automatically detect periods of continuous total inactivity, in order to create an initial mask that can be further visually inspected and edited by the user. Given that manual editing of masked periods might be tedious for large-scale data sets, more sophisticated methods for automatic masking [19, 20] could be implemented in the future. In addition to temporary actigraph removals, another usual source of artifactual inactivity arises when the recording starts before and/or ends after the actigraph is actually worn by the participant. Upon reading an actigraphy file, the pyActigraphy package allows users to discard such inactivity periods by specifying a start and a stop timestamp; the data collected outside this time range are not analyzed. These timestamps can also be specified by batch, using a simple log file in which each line corresponds to a participant's recording. This file is then processed to automatically apply the corresponding boundaries to each actigraphy file read by the package.
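The automatic detection of continuous total inactivity can be sketched as a scan for sufficiently long runs of zero counts. This is an illustrative simplification, not pyActigraphy's actual implementation; the function name and the run-length parameter are assumptions.

```python
# Sketch: flag runs of zero activity of at least `min_length` epochs
# as candidate non-wear periods for an initial mask.

def inactivity_mask(counts, min_length):
    """Return a boolean mask: True where the epoch belongs to a run of
    zeros at least `min_length` epochs long."""
    mask = [False] * len(counts)
    start = None
    for i, c in enumerate(counts + [1]):  # sentinel closes a trailing run
        if c == 0:
            if start is None:
                start = i  # a zero-run begins here
        elif start is not None:
            if i - start >= min_length:
                for j in range(start, i):
                    mask[j] = True
            start = None
    return mask

counts = [3, 0, 0, 0, 0, 5, 0, 2]
print(inactivity_mask(counts, min_length=3))
# → [False, True, True, True, True, False, False, False]
```

The single isolated zero is not masked: only sustained inactivity, unlikely during normal wear, is flagged for visual review.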
Activity profile and onset/offset times

In circadian rhythm and sleep research, plots of the mean daily activity profile of an actigraphy recording provide a visual tool to assess the overall rest-activity pattern, as well as recurrent behaviours such as naps. Patterns extracted from these profiles provide valid biomarkers that have been linked to cognitive decline [21] and psychiatric disorders [22]. Profiles are obtained by averaging consecutive data points that are 24h apart, over the consecutive days contained in the recording. The pyActigraphy package provides methods to construct these profiles (Fig 2). In addition, it provides methods to anchor the 24h profile of an individual to a specific time and therefore ease group averaging; for example, if one uses the dim-light melatonin onset time, it becomes possible to compare activity data acquired at the same circadian phase across participants. For convenience, two methods have been implemented to detect the time points of a profile where the relative difference between the mean activity before and after that point is maximal and minimal, respectively. These time points can then serve as initial estimates of the individual activity onset and offset times.

Fig 2. Visualization example of average daily profiles obtained with pyActigraphy using example files included in the package.
https://doi.org/10.1371/journal.pcbi.1009514.g002
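The averaging of points 24h apart can be sketched directly with list slicing. The function name, the toy epoch resolution (4 epochs per "day" for readability) and the data are illustrative assumptions, not pyActigraphy's API.

```python
# Sketch: build an average daily profile by averaging, for each
# time-of-day bin, the data points that are 24 h apart across days.

def daily_profile(counts, epochs_per_day):
    """Average activity for each time-of-day bin across days."""
    profile = []
    for bin_idx in range(epochs_per_day):
        values = counts[bin_idx::epochs_per_day]  # points 24 h apart
        profile.append(sum(values) / len(values))
    return profile

# Two "days" of 4 epochs each (toy resolution for readability).
counts = [0, 10, 20, 2,
          4, 14, 18, 6]
print(daily_profile(counts, epochs_per_day=4))  # → [2.0, 12.0, 19.0, 4.0]
```

With real data, `epochs_per_day` would be derived from the acquisition frequency (e.g. 1440 for 1-min epochs), and anchoring to a circadian reference amounts to rotating `counts` before slicing.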
Visualization of sleep agenda

In both sleep research and sleep medicine, a sleep diary is usually handed out together with an actimeter so that participants can report sleep episodes (duration and timing) as well as, for example, subjective assessments of sleep quality. It allows comparisons between the data recorded by the actigraph and the subjective perception of the individual wearing the device. In medical settings, sleep diaries are commonly recommended to help clinicians diagnose and treat sleep-wake disorders. The pyActigraphy package allows users to visualize and analyse sleep diaries encoded as .ods or .csv files. Each row of these files describes an event, characterized by a type, a start time and an end time. A summary function provides descriptive statistics (mean, std, quantiles, …) for each type of event. For convenience, and reflecting the current interests of the researchers involved in the development of the package, four types (active, nap, night, no-wear) are implemented by default when a sleep diary is read. However, the pyActigraphy package allows users to remove or customize these types and add new ones. As shown in Fig 3, sleep diaries are visualized using the python plotting library "plotly" [23]: each event found in the sleep diary is associated with a plotly "shape" object that can be overlaid with the actigraphy data, in order to visually assess the adequacy between the subjective reports and their objective counterparts.

Fig 3. Visualization example of actigraphy data, overlaid with periods (green: nap, grey: night, red: device not worn) reported in the sleep diary example file included in the package.
https://doi.org/10.1371/journal.pcbi.1009514.g003
Rest-activity rhythm variables

Non-parametric rest-activity variables can easily be calculated with the pyActigraphy package. These include:

- the interdaily stability (IS) and the intradaily variability (IV) [24], which quantify the day-to-day stability and the fragmentation of the activity, respectively;
- the relative amplitude (RA) [25], which measures the relative difference between the mean activity during the 10 most active hours (M10) and the 5 least active ones (L5).

In addition, pyActigraphy implements the mean IS and IV variables, namely ISm and IVm [26], obtained by averaging IS or IV values calculated on data resampled at different frequencies. Finally, the pyActigraphy package allows users to calculate the IS(m), IV(m) and RA variables over consecutive, non-overlapping time periods of user-defined length. When calling the corresponding function, users can specify the resampling frequency, whether the data should be binarized before calculation, and the threshold used to binarize the data.
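The IS and IV formulas can be sketched with standard-library Python. These follow the classic non-parametric definitions from the literature cited above [24]; pyActigraphy's implementation adds the resampling and binarization options, which are omitted here.

```python
# Stdlib sketch of interdaily stability (IS) and intradaily variability (IV).

def iv(x):
    """IV: normalized mean square of successive differences."""
    n = len(x)
    mean = sum(x) / n
    num = n * sum((x[i] - x[i - 1]) ** 2 for i in range(1, n))
    den = (n - 1) * sum((v - mean) ** 2 for v in x)
    return num / den

def is_(x, bins_per_day):
    """IS: variance of the average daily profile over the total variance."""
    n = len(x)
    mean = sum(x) / n
    # Mean activity per time-of-day bin, averaged across days.
    hourly = [sum(x[b::bins_per_day]) / len(x[b::bins_per_day])
              for b in range(bins_per_day)]
    num = n * sum((h - mean) ** 2 for h in hourly)
    den = bins_per_day * sum((v - mean) ** 2 for v in x)
    return num / den

# A perfectly repeating 2-day pattern is maximally stable: IS = 1.
x = [0, 5, 9, 2] * 2
print(is_(x, bins_per_day=4))  # → 1.0
```

A perfectly rhythmic recording yields IS = 1, while uncorrelated noise pushes IS towards 0 and IV towards 2.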
Fragmentation of rest-activity patterns

The pyActigraphy package implements the rest-activity state transition probabilities kRA and kAR [27], which quantify the fragmentation of the rest-activity pattern. They are based on a probabilistic state transition model in which epochs with no activity are associated with a "rest" state (R) and all other epochs with an "active" state (A): kRA is the probability of transitioning from a sustained "rest" state to an "active" state, and kAR the probability of transitioning from a sustained "active" state to a "rest" state. The pyActigraphy package allows users to restrict the computation of kRA and kAR to specific periods of the day. For example, to target sleep periods, users may specify the activity offset and onset times derived from individual activity profiles (see section Activity profile and onset/offset times) as time boundaries. In the case of kRA, this provides a quantification of sleep fragmentation adapted to a subject's specific rest periods.
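The core of the transition-probability estimate can be sketched as a count of state changes. This is a deliberately simplified illustration: it treats kRA as the fraction of rest epochs followed by an active epoch, without the sustained-run modelling of the published method [27].

```python
# Toy estimate of the rest-to-active transition probability kRA.

def k_ra(counts):
    """Fraction of rest epochs (zero activity) followed by an active epoch."""
    rest_epochs = 0
    transitions = 0
    for cur, nxt in zip(counts, counts[1:]):
        if cur == 0:              # "rest" state: no activity
            rest_epochs += 1
            if nxt > 0:           # transition to the "active" state
                transitions += 1
    return transitions / rest_epochs

counts = [0, 0, 3, 0, 0, 0, 7, 1]
print(k_ra(counts))  # 2 rest→active transitions out of 5 rest epochs → 0.4
```

A higher kRA over the night-time window means rest bouts break into activity more often, i.e. more fragmented sleep.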
Rest-activity period detection

The pyActigraphy package implements several rest-activity detection algorithms, which fall into two broad classes:

Epoch-by-epoch rest/activity scoring algorithms: Cole-Kripke's [28], Oakley's [29], Sadeh's [30] and Scripps' [31] algorithms. The idea underlying these algorithms is to convolve the signal contained in a sliding window with a pre-defined kernel; most algorithms use Gaussian-like kernels. If the resulting value is higher than a certain threshold, the epoch under consideration, usually the one located at the centre of the sliding window, is classified as active; otherwise, it is classified as rest. The window is then shifted forward by one epoch and the classification procedure is repeated.
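The generic scoring scheme shared by this class of algorithms can be sketched as follows. The weights and threshold below are arbitrary placeholders chosen for illustration, not the coefficients of any of the published algorithms.

```python
# Generic sketch of epoch-by-epoch scoring: a weighted sliding window
# is compared to a threshold to classify the centre epoch.

def score_epochs(counts, weights, threshold):
    """Return 1 ('active') or 0 ('rest') for each epoch with a full window."""
    half = len(weights) // 2
    scores = []
    for i in range(half, len(counts) - half):
        window = counts[i - half:i + half + 1]
        value = sum(w * c for w, c in zip(weights, window))
        scores.append(1 if value > threshold else 0)
    return scores

weights = [0.2, 0.5, 1.0, 0.5, 0.2]  # Gaussian-like kernel, centre weighted most
counts = [0, 0, 0, 40, 50, 30, 0, 0, 0, 0]
print(score_epochs(counts, weights, threshold=20))  # → [1, 1, 1, 1, 1, 0]
```

Note how neighbouring activity "leaks" into the scores of adjacent epochs through the kernel, which is exactly what smooths out isolated movement artifacts.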
Detection of consolidated periods of similar activity patterns: Crespo's [32] and Roenneberg's [33] algorithms. These two algorithms differ fundamentally from the epoch-by-epoch scoring algorithms in that they detect consolidated periods of rest at once. One advantage of this class of algorithms is that it provides a start and a stop time for each period classified as rest. As illustrated in Fig 4, these algorithms have been implemented to return a binary time series, where 0 denotes rest or activity depending on the convention of the original article describing the detection algorithm.

Fig 4. Visualization example of actigraphy data, overlaid with periods scored as "active" (0) or "rest" (1) by Roenneberg's algorithm [33] for two different settings (full line: default parameter values; dashed line: threshold set at 0.25 of the activity trend).
https://doi.org/10.1371/journal.pcbi.1009514.g004

Based on the aforementioned algorithms, the pyActigraphy package also allows the computation of a sleep regularity profile, which quantifies the probability that the participant is in the same state (rest or active) at a given time of day on a day-by-day basis. From this 24h profile, the sleep regularity index (SRI) [34, 35] can be calculated as the product of these probabilities over all time bins. Finally, using the detection algorithms of the latter class, the pyActigraphy package allows the computation of the sleep midpoint, as described in [35].
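The per-bin regularity underlying this profile can be sketched as the probability that the binary state is identical on consecutive days. The function name and toy resolution are illustrative assumptions; the published index then combines these per-bin probabilities as described above.

```python
# Sketch of a day-to-day regularity profile from a binary rest(0)/active(1)
# series covering whole days.

def regularity_profile(states, epochs_per_day):
    """For each time-of-day bin, fraction of consecutive-day pairs in the same state."""
    n_days = len(states) // epochs_per_day
    profile = []
    for b in range(epochs_per_day):
        same = sum(
            states[d * epochs_per_day + b] == states[(d + 1) * epochs_per_day + b]
            for d in range(n_days - 1)
        )
        profile.append(same / (n_days - 1))
    return profile

# Three days, 4 bins per day; the last bin flips state on day 3.
states = [0, 0, 1, 1,
          0, 0, 1, 1,
          0, 0, 1, 0]
print(regularity_profile(states, epochs_per_day=4))  # → [1.0, 1.0, 1.0, 0.5]
```

A perfectly regular sleeper scores 1.0 in every bin; irregular bed or wake times pull down the bins around the rest-activity transitions.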
Advanced signal processing

The pyActigraphy package makes additional functions available for more advanced analyses of actigraphy recordings:

Cosinor [36]: the idea of a Cosinor analysis is to estimate key parameters of the actigraphy count series by fitting these data with a (co)sine curve of the form Y(t) = M + A·cos(2πt/τ + φ), where M is the MESOR (midline estimating statistic of rhythm), A the amplitude, τ the period and φ the acrophase.
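For a known period and evenly spaced samples over whole cycles, the cosine and sine regressors are orthogonal and the least-squares Cosinor fit reduces to simple projections. The sketch below illustrates this special case only; the general Cosinor regression (unknown period, arbitrary sampling) requires a full least-squares or non-linear fit.

```python
import math

# Pure-Python Cosinor sketch for a known period and evenly spaced samples
# spanning an integer number of cycles.

def cosinor_fit(y, period, dt=1.0):
    """Fit y(t) = M + A*cos(2*pi*t/period + phi); return (M, A, phi)."""
    n = len(y)
    w = 2 * math.pi / period
    mesor = sum(y) / n
    # Orthogonal projections onto the cosine and sine regressors.
    beta = 2 / n * sum(v * math.cos(w * i * dt) for i, v in enumerate(y))
    gamma = 2 / n * sum(v * math.sin(w * i * dt) for i, v in enumerate(y))
    amp = math.hypot(beta, gamma)
    phi = math.atan2(-gamma, beta)
    return mesor, amp, phi

# Synthetic series: M = 10, A = 3, phi = 0, period of 24 samples.
y = [10 + 3 * math.cos(2 * math.pi * t / 24) for t in range(48)]
M, A, phi = cosinor_fit(y, period=24)
print(round(M, 3), round(A, 3), abs(round(phi, 3)))  # → 10.0 3.0 0.0
```

The fit recovers the parameters used to generate the series, which is a quick sanity check one can run before applying the method to real count data.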
Detrended Fluctuation Analysis (DFA) [37, 38]: human activity exhibits a temporal organization characterised by scale-invariant (fractal) patterns over time scales ranging from minutes to 24 hours. This organization has been shown to degrade with ageing and dementia [39]. The DFA method quantifies this scale-invariance and comprises four steps:

- signal integration and mean subtraction;
- signal segmentation;
- local detrending of each segment;
- computation of the q-th order fluctuations.

All these steps have been implemented in the DFA class of pyActigraphy.
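The four steps above can be sketched in standard-library Python for the second-order (q = 2) case; q-th order generalizations follow the same pattern. This is an illustration of the method, not pyActigraphy's DFA class.

```python
import math, random

# Stdlib sketch of the four DFA steps for a single window size (q = 2).

def dfa_fluctuation(x, window):
    """Root-mean-square fluctuation after local linear detrending."""
    # Step 1: integrate the mean-subtracted signal.
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:
        s += v - mean
        profile.append(s)
    # Step 2: segment the profile into non-overlapping windows.
    n_seg = len(profile) // window
    sq = 0.0
    for k in range(n_seg):
        seg = profile[k * window:(k + 1) * window]
        # Step 3: local detrending (least-squares line fit per segment).
        t = list(range(window))
        tbar, ybar = sum(t) / window, sum(seg) / window
        slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, seg))
                 / sum((ti - tbar) ** 2 for ti in t))
        intercept = ybar - slope * tbar
        # Step 4: accumulate squared residual fluctuations (q = 2).
        sq += sum((yi - (intercept + slope * ti)) ** 2 for ti, yi in zip(t, seg))
    return math.sqrt(sq / (n_seg * window))

# For uncorrelated noise, F(n) grows roughly as n^0.5 with window size n.
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(1024)]
print(dfa_fluctuation(noise, window=16) < dfa_fluctuation(noise, window=128))  # → True
```

Repeating this over a range of window sizes and fitting log F(n) against log n yields the scaling exponent that DFA reports.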
Functional linear modelling (FLM) [40]: this approach consists of converting discrete measures into a function, or a set of functions, that can be used for further analysis. In most cases, the smoothness of the resulting function is controlled, which ensures its differentiability. Three techniques are available in pyActigraphy to convert actigraphy data to a functional form:

- Fourier expansion;
- B-spline interpolation;
- smoothing.
In the context of actigraphy, functional linear modelling and analysis have been successfully applied to link sleep apnea and obesity to specific circadian activity patterns [41].

Locomotor inactivity during sleep (LIDS) [42]: the analysis of locomotor activity during sleep revealed a rhythmicity that mimics the ultradian dynamics of sleep. This type of analysis opens new opportunities to study sleep dynamics in situ, at a large scale and over long individual time periods. The LIDS class implements all the functions necessary to analyse the LIDS oscillations:

- sleep bout filtering;
- non-linear conversion of activity to inactivity;
- extraction of the characteristic features of the LIDS oscillations via a cosine fit.
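The non-linear conversion step can be sketched as follows. The inverse transform LIDS = 100/(counts + 1) follows the form reported in [42], but the smoothing window and its handling of the series edges are simplified assumptions here.

```python
# Sketch of the LIDS activity-to-inactivity conversion: smooth the counts,
# then apply an inverse transform so that low activity maps to high values.

def lids_transform(counts, window=3):
    """Moving-average smoothing followed by the inverse LIDS transform."""
    half = window // 2
    smoothed = []
    for i in range(len(counts)):
        lo, hi = max(0, i - half), min(len(counts), i + half + 1)
        smoothed.append(sum(counts[lo:hi]) / (hi - lo))
    return [100.0 / (c + 1.0) for c in smoothed]

counts = [0, 0, 30, 0, 0]
print([round(v, 1) for v in lids_transform(counts)])
# → [100.0, 9.1, 9.1, 9.1, 100.0]
```

On the resulting inactivity scale, quiescent sleep bouts appear as high plateaus whose periodic dips can then be characterized with a cosine fit.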
Singular spectrum analysis (SSA) [43, 44]: this technique decomposes a time series into additive components and quantifies their respective partial variance. In the context of actigraphy, SSA can be used to extract the signal trend as well as the circadian and ultradian components separately. The latter is relevant in human sleep research because sleep not only alternates with wakefulness over the 24-hour cycle, but also exhibits an ultradian modulation, as mentioned previously. For example, an SSA analysis has been used to reveal alterations of the ultradian rhythms in insomnia [45]. All the necessary steps of the SSA and related functions, namely the embedding, the singular value decomposition, the eigentriple grouping and the diagonal averaging, are implemented in the SSA class. Since these calculations can be computationally intensive, the class implementation uses the open-source compiler Numba [46] to translate the functions directly into machine code, improving their execution speed by several orders of magnitude.
Online documentation and tutorials The online documentation of the pyActigraphy package (
https://ghammad.github.io/pyActigraphy) contains instructions to install the package, as well as information about the authors and the code license. It also contains a list of the attributes and methods available in the pyActigraphy package. More information about their implementation, as well as the reference to the related original research articles, can be found in the online API documentation (
https://ghammad.github.io/pyActigraphy/api.html), which is generated automatically from source code annotations. In order to keep the documentation up to date with the latest developments of the package, the documentation is automatically generated anew and made available online for each new release. Finally, the online documentation offers several tutorials (
https://ghammad.github.io/pyActigraphy/tutorials.html), illustrating the various functionalities of the package. These tutorials are generated from Jupyter notebooks [47] that are included in the pyActigraphy package itself, so that any user can reproduce and explore the various functionalities of the package in an interactive and user-friendly environment. As input data, the tutorials use real example data files that are included in the package for illustration and testing purposes. In total, 13 examples are included.
[END]
[1] Url:
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1009514