diff --git a/docs/assets/guides/hk-dashboard.jpeg b/docs/assets/guides/hk-dashboard.jpeg
deleted file mode 100644
index a613d9dd..00000000
Binary files a/docs/assets/guides/hk-dashboard.jpeg and /dev/null differ
diff --git a/docs/assets/heartkit-banner.png b/docs/assets/heartkit-banner.png
index 5685748a..9f576f21 100644
Binary files a/docs/assets/heartkit-banner.png and b/docs/assets/heartkit-banner.png differ
diff --git a/docs/guides/heartkit-demo.md b/docs/guides/heartkit-demo.md
deleted file mode 100644
index 93b5fe2f..00000000
--- a/docs/guides/heartkit-demo.md
+++ /dev/null
@@ -1,111 +0,0 @@
-# :octicons-heart-fill-24:{ .heart } HeartKit: AI Heart Analysis Demo
-
-## Overview
-
-The HeartKit: AI Heart Analysis demo is a real-time, ECG-based heart analysis demonstrator that showcases several AI models trained using Ambiq's open-source HeartKit ADK. By leveraging a modern multi-head network architecture coupled with Ambiq's ultra low-power SoC, the demo is designed to be **efficient**, **explainable**, and **extensible**.
-
-The architecture consists of 1-lead ECG data collected in real-time from a MAX86150 sensor. The data is preprocessed using an AI-based denoising model followed by an ECG segmentation model. The segmentation model is used to annotate every sample as either P-wave, QRS, T-wave, or none. The resulting ECG data and segmentation mask are then fed into upstream “heads” to perform inference. The upstream heads include an HRV head, a rhythm head, and a beat head. The HRV head is used to calculate heart rate, rhythm, and heart rate variability from the segmented QRS peaks. The rhythm head is used to detect the presence of arrhythmias including Atrial Fibrillation (AFIB) and Atrial Flutter (AFL). The beat head is used to classify individual irregular beats as either normal or ectopic.
-
-```mermaid
-flowchart LR
-    COLL[1. Collect/Preprocess] --> DEN
-    subgraph Models
-    DEN[2. Denoise] --> SEG
-    SEG[3. Segmentation] --> HRV
-    SEG[3. Segmentation] --> BEAT
-    SEG[3. Segmentation] --> ARR
-    end
-    HRV[4. HRV] --> BLE
-    BEAT[4. BEAT] --> BLE
-    ARR[4. Rhythm] --> BLE
-    BLE[5. BLE] --> APP
-    APP[6. Tileio App]
-```
-
-In the first stage, 5 seconds of sensor data is collected, either from stored subject data or directly from the MAX86150 sensor. In stage 2, the ECG data is denoised, and in stage 3 it is segmented. In stage 4, the cleaned, segmented data is fed into the upstream HeartKit models to perform inference. Finally, in stage 5, the ECG data and metrics are streamed over BLE to the Tileio app to be displayed in a dashboard.
-
----
-
-## Architecture
-
-The HeartKit demo leverages a multi-head network: a backbone denoising and segmentation model followed by three upstream heads:
-
-* [__Denoising model__](../tasks/denoise.md) utilizes a small 1-D TCN architecture to remove noise from the ECG signal.
-* [__Segmentation model__](../tasks/segmentation.md) utilizes a small 1-D TCN architecture to perform ECG segmentation.
-* [__Rhythm head__](../tasks/rhythm.md) utilizes a 1-D MBConv CNN to detect arrhythmias including AFIB and AFL.
-* [__Beat-level head__](../tasks/beat.md) utilizes a 1-D MBConv CNN to detect irregular individual beats (PAC, PVC).
-* __HRV head__ utilizes segmentation results to derive a number of useful metrics including heart rate and heart rate variability (HRV).
-
-
----
-
-## Demo Setup
-
-### Contents
-
-The following items are needed to run the HeartKit demo:
-
-* 1x Apollo4 Blue Plus EVB
-* 1x MAX86150 Breakout Board
-* 1x iPad or laptop w/ Chrome Browser
-* 1x USB-C battery pack for EVB
-* 2x USB-C cables
-* 1x Qwiic cable
-
-!!! note
-    Please be sure to run the EVB from battery when using live sensor data. In addition, be sure to minimize surrounding EMI/RFI noise as the exposed sensor board's ECG pads are highly sensitive.
-
-### Flash Firmware
-
-If using a fresh Apollo4 EVB, it will need to be flashed with the HeartKit firmware. The instructions to compile and flash the firmware can be found in [Tileio Demos](https://github.com/AmbiqAI/tileio-demos/tree/main/heartkit).
-
-### Hardware Setup
-
-In order to connect the MAX86150 breakout board to the EVB, we leverage the Qwiic connector on the breakout board. This will require a Qwiic breakout cable. For 3V3, use a jumper to connect Vext to the 3V3 power rail. Then connect the cable as follows:
-
-| Qwiic Cable  | EVB Board         |
-| ------------ | ----------------- |
-| Power (RED)  | VCC (J17 pin 1)   |
-| GND (BLACK)  | GND (J11 pin 3)   |
-| SCL (YELLOW) | GPIO8 (J11 pin 3) |
-| SDA (BLUE)   | GPIO9 (J11 pin 1) |
-
-<figure markdown>
-  ![max86150-5pin-header](../assets/guides/max86150-5pin-header.webp){ width="480" }
-  <figcaption>MAX86150 Sensor Board</figcaption>
-</figure>
-
----
-
-## Run Demo
-
-1. Connect the MAX86150 breakout board to the EVB using the Qwiic cable.
-
-2. Power on the EVB using the USB-C battery pack.
-
-3. Launch the Tileio app on your iPad or go to [Tileio Web](https://ambiqai.github.io/tileio) using a desktop Chrome browser.
-
-4. If this is a new device, click on the "+" icon in the top right corner to add a new device.
-
-    1. Scan and select the device.
-    2. Configure the device manually or upload the [device configuration file](https://github.com/AmbiqAI/tileio-demos/blob/d4a6806e404dab04eaf30db92fa2ae1d6d474c79/assets/device-configs/hk-device-config.json).
-    3. Review and select "ADD" to add the device.
-
-5. On the “Devices view”, scan for devices. The device should turn opaque and say "ONLINE".
-
-6. Tap on the target device Tile to display the device dashboard.
-
-7. In the device dashboard, tap the BLE ( :material-bluetooth: ) icon to connect to the device.
-
-8. If this is a new device, go to "Settings" and configure the dashboard Tiles. Upload the [dashboard configuration file](https://github.com/AmbiqAI/tileio-demos/blob/d4a6806e404dab04eaf30db92fa2ae1d6d474c79/assets/dashboard-configs/hk-dashboard-config.json).
-
-9. After 5 seconds, live data should start streaming to the Tileio app.
-
-10. Use the "Input Select" to switch the subject ECG input and the "Noise Input" slider to inject additional noise.
-
-![evb-demo-plot](../assets/guides/hk-dashboard.jpeg)
-
-
----
diff --git a/docs/guides/index.md b/docs/guides/index.md
index ffa9a681..44ba7ed2 100644
--- a/docs/guides/index.md
+++ b/docs/guides/index.md
@@ -17,4 +17,4 @@ This section contains guides to help with various aspects of HeartKit. The guide
 ## Hardware Guides
 
 - **[Run simple demo on EVB]()**: Running a demo using Ambiq SoC as backend inference engine.
-- **[Full HeartKit EVB App](heartkit-demo.md)**: A guide to running a multi-headed model demo on Ambiq EVB.
+- **[HeartKit Tileio Demo](https://ambiqai.github.io/tileio-docs/demos/heartkit/)**: A guide to running a multi-headed model demo on Ambiq EVB.
diff --git a/docs/tasks/denoise.md b/docs/tasks/denoise.md
index 6313fd0d..409b4396 100644
--- a/docs/tasks/denoise.md
+++ b/docs/tasks/denoise.md
@@ -44,7 +44,7 @@ Dataloaders are available for the following datasets:
 
 ## Pre-trained Models
 
-The following table provides the latest performance and accuracy results of denoising models. Additional result details can be found in [Model Zoo → Denoise](../zoo/denoise.md).
+The following table provides the latest performance and accuracy results of denoising models. Additional result details can be found in [Model Zoo → Denoise](../zoo/index.md).
 
 --8<-- "assets/zoo/denoise/denoise-model-zoo-table.md"
 
diff --git a/docs/tasks/rhythm.md b/docs/tasks/rhythm.md
index bad1153d..8d0a4e80 100644
--- a/docs/tasks/rhythm.md
+++ b/docs/tasks/rhythm.md
@@ -77,7 +77,7 @@ Dataloaders are available for the following datasets:
 
 ## Pre-trained Models
 
-The following table provides the latest performance and accuracy results for rhythm models. Additional result details can be found in [Model Zoo → Rhythm](../zoo/rhythm.md).
+The following table provides the latest performance and accuracy results for rhythm models. Additional result details can be found in [Model Zoo → Rhythm](../zoo/index.md).
 
 --8<-- "assets/zoo/rhythm/rhythm-model-zoo-table.md"
 
diff --git a/docs/tasks/segmentation.md b/docs/tasks/segmentation.md
index 74f94a19..12b6fff5 100644
--- a/docs/tasks/segmentation.md
+++ b/docs/tasks/segmentation.md
@@ -56,7 +56,7 @@ Dataloaders are available for the following datasets:
 
 ## Pre-Trained Models
 
-The following table provides the latest performance and accuracy results for segmentation models. Additional result details can be found in [Model Zoo → Segmentation](../zoo/segmentation.md).
+The following table provides the latest performance and accuracy results for segmentation models. Additional result details can be found in [Model Zoo → Segmentation](../zoo/index.md).
 
 --8<-- "assets/zoo/segmentation/segmentation-model-zoo-table.md"
 
diff --git a/docs/zoo/index.md b/docs/zoo/index.md
index 59736c8a..d7f00eb0 100644
--- a/docs/zoo/index.md
+++ b/docs/zoo/index.md
@@ -13,7 +13,7 @@ The following table provides the latest performance and accuracy results for den
 | __DEN-PPG-TCN-SM__ | Synthetic | 100Hz | 2.5s | TCN | 3.5K | 1.1M | 92.1% COS |
 
-## [Signal Segmentation Task](./tasks/segmentation.md)
+## [Signal Segmentation Task](../tasks/segmentation.md)
 
 The following table provides the latest performance and accuracy results for ECG segmentation models.
 
@@ -25,7 +25,7 @@ The following table provides the latest performance and accuracy results for ECG
 | __SEG-PPG-2-TCN-SM__ | Synthetic | 100Hz | 2.5s | 2 | TCN | 4K | 1.43M | 98.6% F1 |
 
-## [Rhythm Classification Task](./tasks/rhythm.md)
+## [Rhythm Classification Task](../tasks/rhythm.md)
 
 The following table provides the latest performance and accuracy results for rhythm classification models.
 
@@ -35,7 +35,7 @@ The following table provides the latest performance and accuracy results for rhy
 | __ARR-4-EFF-SM__ | LSAD | 100Hz | 5s | 4 | EfficientNetV2 | 27K | 1.6M | 95.9% F1 |
 
-## [Beat Classification Task](./tasks/beat.md)
+## [Beat Classification Task](../tasks/beat.md)
 
 The following table provides the latest performance and accuracy results for beat classification models.
 
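Reviewer note on the two dataset patches below: `label_mask` is built as a NumPy boolean array, so comparing it against `-1` can never be true and the filter silently kept patients with no target class; the bitwise NOT is the correct negation. A minimal sketch of the behavior (the `patients_labels` values here are made up for illustration):

```python
import numpy as np

# Hypothetical label lists for three patients; the second has no target class.
patients_labels = [[1, 3], [], [2]]
label_mask = np.array([len(x) > 0 for x in patients_labels])  # [ True, False,  True]

# Old expression: a boolean array never equals -1, so no patient was ever flagged.
print((label_mask == -1).sum())  # 0 -> unlabeled patients slipped through

# New expression: NOT of the boolean mask flags exactly the patients with empty label lists.
print((~label_mask).sum())       # 1 -> that patient is now dropped
```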
diff --git a/heartkit/datasets/lsad.py b/heartkit/datasets/lsad.py
index 4476772f..67518cd2 100644
--- a/heartkit/datasets/lsad.py
+++ b/heartkit/datasets/lsad.py
@@ -481,7 +481,7 @@ def filter_patients_for_labels(
         patients_labels = self.get_patients_labels(patient_ids, label_map, label_type)
         # Find any patient with empty list
         label_mask = np.array([len(x) > 0 for x in patients_labels])
-        neg_mask = label_mask == -1
+        neg_mask = ~label_mask
         num_neg = neg_mask.sum()
         if num_neg > 0:
             logger.debug(f"Removed {num_neg} of {patient_ids.size} patients w/ no target class")
diff --git a/heartkit/datasets/ptbxl.py b/heartkit/datasets/ptbxl.py
index 72be0cbe..68d67649 100644
--- a/heartkit/datasets/ptbxl.py
+++ b/heartkit/datasets/ptbxl.py
@@ -565,7 +565,7 @@ def filter_patients_for_labels(
         patients_labels = self.get_patients_labels(patient_ids, label_map, label_type)
         # Find any patient with empty list
         label_mask = np.array([len(x) > 0 for x in patients_labels])
-        neg_mask = label_mask == -1
+        neg_mask = ~label_mask
         num_neg = neg_mask.sum()
         if num_neg > 0:
             logger.debug(f"Removed {num_neg} of {patient_ids.size} patients w/ no target class")
diff --git a/heartkit/tasks/denoise/datasets.py b/heartkit/tasks/denoise/datasets.py
index 896ed4dd..faa4b8f6 100644
--- a/heartkit/tasks/denoise/datasets.py
+++ b/heartkit/tasks/denoise/datasets.py
@@ -43,8 +43,8 @@ def create_data_pipeline(
         drop_remainder=True,
         num_parallel_calls=tf.data.AUTOTUNE,
     )
-    ds = ds.map(lambda x: preprocessor(x), num_parallel_calls=tf.data.AUTOTUNE)
     ds = ds.map(lambda x: (augmenter(x), x), num_parallel_calls=tf.data.AUTOTUNE)
+    ds = ds.map(lambda x, y: (preprocessor(x), preprocessor(y)), num_parallel_calls=tf.data.AUTOTUNE)
     return ds.prefetch(tf.data.AUTOTUNE)
 
diff --git a/mkdocs.yml b/mkdocs.yml
index 0d33f60b..598641ef 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -101,7 +101,7 @@ nav:
 - Hardware Guides:
   - EVB Setup: guides/evb-setup.md
   - Rhythm Demo: guides/rhythm-demo.md
-  - HeartKit Demo: guides/heartkit-demo.md
+  - HeartKit Tileio Demo →: https://ambiqai.github.io/tileio-docs/demos/heartkit/
 - API: api/
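Editor's note on the `heartkit/tasks/denoise/datasets.py` change above: the patch builds the (noisy, clean) pair from the raw signal first and only then preprocesses the input and target separately. A minimal sketch of the new ordering, with toy stand-ins for `augmenter` and `preprocessor` (both functions are illustrative assumptions, not the library's implementations):

```python
import tensorflow as tf

# Toy stand-ins (illustrative only, not the HeartKit implementations).
def augmenter(x):
    # Inject synthetic noise to create the "noisy" input.
    return x + tf.random.normal(tf.shape(x), stddev=0.1)

def preprocessor(x):
    # Z-score normalize a frame.
    return (x - tf.reduce_mean(x)) / (tf.math.reduce_std(x) + 1e-6)

# Eight fake 250-sample, single-channel ECG frames.
ds = tf.data.Dataset.from_tensor_slices(tf.random.normal((8, 250, 1))).batch(4)

# 1) Pair the augmented (noisy) frame with the original clean frame first...
ds = ds.map(lambda x: (augmenter(x), x), num_parallel_calls=tf.data.AUTOTUNE)
# 2) ...then preprocess input and target independently, matching the new ordering in the patch.
ds = ds.map(lambda x, y: (preprocessor(x), preprocessor(y)), num_parallel_calls=tf.data.AUTOTUNE)

for noisy, clean in ds.take(1):
    print(noisy.shape, clean.shape)  # (4, 250, 1) (4, 250, 1)
```

Presumably the point of the reorder is that the preprocessor now runs on the noise-augmented input, as it would on real sensor data, rather than running before the noise is injected.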