Wrist-worn fitness monitors, or wearables, have become widely adopted by consumers and are gaining increased attention from researchers for their potential contributions to digital health, specifically as it relates to psychology, psychiatry, and medicine. These devices contain a multitude of sensors, including photoplethysmography, an optical heart rate (HR) sensor. While a number of studies have examined the accuracy of a variety of wearables in controlled laboratory environments (e.g., treadmill, stationary bike; see Shcherbina et al., 2017; Wallen et al., 2016; Wang et al., 2017), no studies to the author's knowledge have tested the accuracy of these devices' HR sensors as they are used in the real world, across a 48-hour ecologically valid paradigm that approximates actual conditions of device use. The objective of this study is to determine the HR accuracy of two of the most popular wearables (the Apple Watch 3 and the Fitbit Charge 2) against the gold-standard electrocardiogram (ECG) in a real-world setting, with the exploratory goal of assessing the Apple Watch 3's heart rate variability (HRV) accuracy relative to ECG.
This single-subject study (n = 1) will compare the Apple Watch 3 and Fitbit Charge 2 to an ECG across 48 hours in the ecological setting of daily life.
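As a concrete illustration of what quantifying "HR accuracy" against ECG might look like analytically, the minimal sketch below computes mean absolute error (MAE) and mean absolute percentage error (MAPE) between time-aligned device and ECG heart-rate samples. The function name, the assumption that alignment and resampling have already been done upstream, and the choice of MAE/MAPE as metrics are illustrative assumptions, not the study's specified analysis plan.

```python
import numpy as np

def hr_accuracy(device_hr, ecg_hr):
    """Compare wearable HR samples against time-aligned ECG-derived HR.

    Both inputs are 1-D sequences of beats-per-minute values at matching
    timestamps (temporal alignment is assumed to be done upstream).
    Returns (MAE in bpm, MAPE in percent).
    """
    device_hr = np.asarray(device_hr, dtype=float)
    ecg_hr = np.asarray(ecg_hr, dtype=float)
    err = device_hr - ecg_hr
    mae = np.mean(np.abs(err))                    # mean absolute error, bpm
    mape = np.mean(np.abs(err) / ecg_hr) * 100.0  # mean absolute percentage error
    return mae, mape

# Illustrative paired readings only (not study data)
watch_bpm = [72, 75, 80, 78, 76]
ecg_bpm = [70, 76, 82, 77, 75]
mae, mape = hr_accuracy(watch_bpm, ecg_bpm)
print(f"MAE = {mae:.1f} bpm, MAPE = {mape:.1f}%")
```

In a free-living 48-hour recording, the harder practical step is the upstream alignment: device HR is typically reported at irregular intervals, so device samples would need to be interpolated or windowed onto the ECG-derived HR timeline before metrics like these can be computed.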