PatientPulse is pre-wired to India's Digital Health stack and key SaaS providers. Each integration is toggled via environment variables — no custom code required for standard configurations.
## ABDM
India's National Digital Health Infrastructure — ABHA identity, PHR push/pull, HFR/HPR registry sync.
PatientPulse ships an ABDM connector that handles the full integration lifecycle. Set ABDM_CONNECTOR_MODE=stub in development for mock responses, and ABDM_CONNECTOR_MODE=live in production.
Configure the connector in your .env file; use stub mode for development. Key endpoints and flags:

- GET /interop/abdm/identity-links/{link_request_id}
- ABDM_AUTO_PUSH_ON_CONSULT=true pushes FHIR bundles on every completed consult

Add a new route file backend/app/routes/webhooks_abdm.py to PatientPulse that handles incoming ABDM notifications:
- POST /webhooks/abdm/consent-artefact — patient consented to data share
- POST /webhooks/abdm/health-data-push — ABDM pushing health data to us
- POST /webhooks/abdm/link-confirm — ABHA link confirmed by patient
Each handler should:
1. Validate X-ABDM-Signature header (HMAC-SHA256 with ABDM_WEBHOOK_SECRET env var)
2. Parse the payload into Pydantic models
3. Emit a Postgres LISTEN/NOTIFY event for downstream processing
4. Return HTTP 200 {"status": "received"}
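The four handler steps above can be sketched framework-free. This is a minimal sketch only: route wiring, the Pydantic models, and the real Postgres NOTIFY call are elided, and the function and parameter names are illustrative, not the actual codebase's.

```python
import hashlib
import hmac
import json


def verify_abdm_signature(body: bytes, header: str, secret: str) -> bool:
    """Step 1: X-ABDM-Signature is assumed to be an HMAC-SHA256 hex digest
    of the raw request body, keyed with ABDM_WEBHOOK_SECRET."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header)


def handle_consent_artefact(body: bytes, signature: str, secret: str,
                            notify=lambda channel, payload: None) -> tuple[int, dict]:
    """Core logic shared by all three handlers; `notify` stands in for the
    Postgres LISTEN/NOTIFY emit."""
    if not verify_abdm_signature(body, signature, secret):
        return 401, {"status": "invalid signature"}
    payload = json.loads(body)                   # step 2: parse (Pydantic models in the real handler)
    notify("abdm_events", json.dumps(payload))   # step 3: emit for downstream processing
    return 200, {"status": "received"}           # step 4: acknowledge
```

In the real route file this body would sit inside a FastAPI handler that reads the raw body and the X-ABDM-Signature header before parsing.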
Register the router in main_domains/ under the interop chunk.

## FHIR R4
HL7 FHIR R4 patient export, clinical bundles, and adapter contracts for third-party system integration.
PatientPulse maps its internal domain models to HL7 FHIR R4 resources. Use the GET /interop/fhir/r4/patients/{patient_id} endpoint to export any patient as a standards-compliant Bundle.
```bash
curl -X GET https://api.patientpulse.in/interop/fhir/r4/patients/pat_abc123 \
  -H "Authorization: Bearer <your_token>" \
  -H "Accept: application/fhir+json"
```
In the PatientPulse FHIR adapter (find the file handling GET /interop/fhir/r4/patients/{id}),
extend the Bundle to include DiagnosticReport resources for each lab test result
associated with the patient.
Each DiagnosticReport should include:
- resourceType: "DiagnosticReport"
- status: from the PatientPulse lab result status
- code: LOINC code where available, fallback to local code
- subject: Reference to Patient
- result: array of Observation references (one per marker)
- issued: timestamp of the lab result
Map PatientPulse lab result fields to the FHIR schema.
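A mapper satisfying the field list above might look like the following sketch. The input field names (status, loinc_code, local_code, observation_ids, issued_at) are assumptions about the internal lab-result model, not the actual schema, and the local code system URL is hypothetical.

```python
def to_diagnostic_report(lab: dict) -> dict:
    """Map a PatientPulse lab-result dict to a FHIR R4 DiagnosticReport dict.
    Falls back to a local code system when no LOINC code is available."""
    coding = (
        {"system": "http://loinc.org", "code": lab["loinc_code"]}
        if lab.get("loinc_code")
        else {"system": "https://patientpulse.in/codes", "code": lab["local_code"]}
    )
    return {
        "resourceType": "DiagnosticReport",
        "status": lab["status"],  # e.g. "final", from the PatientPulse status
        "code": {"coding": [coding]},
        "subject": {"reference": f"Patient/{lab['patient_id']}"},
        "result": [{"reference": f"Observation/{oid}"}
                   for oid in lab["observation_ids"]],  # one per marker
        "issued": lab["issued_at"],  # ISO 8601 timestamp of the result
    }
```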
Add unit tests for the new mapper function.

## NHCX
Submit insurance claims, poll real-time status, and handle pre-authorization via India's unified claims gateway.
PatientPulse integrates with NHCX for end-to-end insurance claim submission. Claims go through the workbench (GET /insurance/claims/{id}/workbench), are submitted via the connector, and status is polled using the correlation ID.
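The polling step can be sketched as follows. The `fetch` callable stands in for an HTTP GET that returns the decoded JSON body, and the terminal state values here are assumptions, not the connector's documented vocabulary.

```python
import time


def poll_claim_status(correlation_id: str, fetch,
                      interval_s: float = 2.0, max_attempts: int = 5) -> dict:
    """Poll claim status by correlation ID until a terminal state or the
    attempt budget is exhausted; returns the last status payload seen."""
    url = f"/insurance/nhcx/claim-status/{correlation_id}"
    status: dict = {}
    for attempt in range(max_attempts):
        status = fetch(url)
        # "approved" / "rejected" as terminal states is an assumption
        if status.get("state") in {"approved", "rejected"}:
            return status
        if attempt < max_attempts - 1:
            time.sleep(interval_s)
    return status
```

In production you would back this with the real HTTP client plus jittered intervals rather than a fixed sleep.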
- POST /insurance/claims with claim type preauth
- GET /insurance/payer-rules returns applicable policy rules
- GET /insurance/nhcx/claim-status/{correlation_id} for real-time status

## Razorpay
Patient payments, clinic subscription billing, GST invoicing, refunds, and webhook verification.
When creating a Razorpay subscription for developer API access, pass the following notes fields so the webhook can identify the org and promote the plan automatically:
```json
{
  "product": "developer_api",
  "org_id": "<your PatientPulse org ID>",
  "plan": "growth"  // or "enterprise"
}
```
On subscription.activated / subscription.charged: PatientPulse upserts organization_subscriptions, promotes your plan tier, and auto-issues a pp_live_sk_… production key emailed to your billing_email — no manual key creation needed.
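Before acting on subscription.activated or subscription.charged, verify Razorpay's X-Razorpay-Signature header, which is an HMAC-SHA256 hex digest of the raw body keyed with your webhook secret. The sketch below also pulls the notes fields back out; the payload path assumes Razorpay's standard subscription event shape, and the plan-promotion logic itself is elided.

```python
import hashlib
import hmac
import json


def verify_razorpay_webhook(body: bytes, signature: str, webhook_secret: str) -> bool:
    """Razorpay signs webhooks with HMAC-SHA256 over the raw body."""
    expected = hmac.new(webhook_secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


def extract_promotion(body: bytes) -> dict:
    """Recover the notes fields set at subscription creation so the org
    and target plan can be identified."""
    event = json.loads(body)
    notes = event["payload"]["subscription"]["entity"]["notes"]
    return {"org_id": notes["org_id"], "plan": notes["plan"]}
```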
## Twilio
Patient OTP delivery, appointment reminders, and incoming WhatsApp message handling.
Twilio is also used in the PulseVault Node.js backend for the same OTP flow. Rate limits apply and are configurable via AUTH_OTP_PHONE_LIMIT_PER_15_MIN.
## Health Connect
Wearable and fitness data ingestion — steps, heart rate, sleep, and chronic condition markers.
PulseVault integrates with Android Health Connect to pull wearable data into a patient's profile. Data is synced to the family profile and surfaces in the AI Insights screen as trending markers.
PatientPulse backend receives Health Connect payloads via POST /health-connect/sync (Feature 4). The server stores them as structured markers linked to the patient's profile.
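For illustration, a sync payload item might carry fields like the following. This is a sketch of a plausible shape only: the field names and units are assumptions, not the actual API contract for POST /health-connect/sync.

```python
from dataclasses import dataclass, asdict


@dataclass
class HealthMarker:
    """One structured marker derived from a Health Connect record (hypothetical shape)."""
    patient_id: str
    marker_type: str   # e.g. "steps", "heart_rate", "sleep_session"
    value: float
    unit: str          # e.g. "count", "bpm", "minutes"
    recorded_at: str   # ISO 8601 timestamp


def to_sync_payload(markers: list[HealthMarker]) -> dict:
    """Bundle markers into the body a sync client would POST."""
    return {"markers": [asdict(m) for m in markers]}
```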
In the PulseVault Android app (apps/patient-android or the Vault repo), add a background sync worker using WorkManager to periodically sync Health Connect data to the PatientPulse backend. Requirements:

- Create HealthConnectSyncWorker : CoroutineWorker in feature/vault or core/data
- Read the following Health Connect record types: StepsRecord, HeartRateRecord, SleepSessionRecord, BloodGlucoseRecord, BloodPressureRecord
- Transform into the payload expected by POST /v1/health-connect/sync
- Schedule with PeriodicWorkRequest: every 6 hours, requires network
- Handle permission denial gracefully — show the HealthConnectPermissionBanner composable if permissions are not granted
- Inject via Hilt; bind in the Application class
## SSO & OAuth
Enterprise SSO for clinic staff login, and Google OAuth for patient self-service.
SAML 2.0 SSO uses python3-saml. Initiate via POST /auth/sso/start and discover providers at GET /auth/sso/discovery. Per-org SSO config is stored in organization settings.
## AI & LLM
Cloud LLM via OpenAI GPT-4, or on-premise via Ollama for privacy-sensitive deployments.
Set AI_NO_RAW_PATIENT_DATA=true to strip all identifiers before sending prompts to the LLM. Use AI_LOCAL_MODEL_URL with Ollama for fully on-premise deployments that never leave your network.
In the PatientPulse backend, find the AI service module (the file handling POST /ai/interpret-lab-report and POST /ai/ambient-scribe). Refactor the LLM client to support Anthropic Claude as an alternative to OpenAI:

- Add ANTHROPIC_API_KEY and ANTHROPIC_MODEL env vars (default: claude-sonnet-4-6)
- Add an AI_PROVIDER env var: openai (default) | anthropic | local
- Create an AbstractLLMClient base class with a .complete(prompt: str) -> str method
- Implement OpenAIClient, AnthropicClient, and OllamaClient subclasses
- Enable Anthropic prompt caching (cache_control: ephemeral on system prompt blocks) to reduce cost on repeated clinical prompts
- Wire via a factory function get_llm_client() that reads AI_PROVIDER from the environment
- No changes to the route handlers — only the service layer changes
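A minimal sketch of the abstraction this task describes. The per-provider request code is deliberately stubbed (SDK details vary and are not specified here); only the class shapes and the factory follow the requirements above.

```python
import os
from abc import ABC, abstractmethod


class AbstractLLMClient(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for a single prompt."""


class OpenAIClient(AbstractLLMClient):
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call the OpenAI chat completions API here")


class AnthropicClient(AbstractLLMClient):
    def complete(self, prompt: str) -> str:
        raise NotImplementedError(
            "call the Anthropic Messages API here; mark system prompt "
            "blocks with cache_control: ephemeral for prompt caching")


class OllamaClient(AbstractLLMClient):
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("POST to the server at AI_LOCAL_MODEL_URL here")


def get_llm_client() -> AbstractLLMClient:
    """Factory: select the provider from AI_PROVIDER (openai | anthropic | local)."""
    provider = os.environ.get("AI_PROVIDER", "openai")
    clients = {"openai": OpenAIClient, "anthropic": AnthropicClient, "local": OllamaClient}
    return clients[provider]()
```

Because the route handlers only depend on `AbstractLLMClient.complete`, swapping providers is purely a configuration change.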
## Webhooks
Configure an HTTPS endpoint in the Developer Portal and PatientPulse will POST real-time event payloads to it.
| Event | Trigger | Key Payload Fields |
|---|---|---|
| appointment.created | New appointment booked | appointment_id, patient_id, doctor_id, slot_time |
| appointment.cancelled | Appointment cancelled by patient or staff | appointment_id, reason, cancelled_by |
| consult.completed | Doctor marks consultation complete | visit_id, patient_id, doctor_id, diagnosis_codes[] |
| claim.submitted | Insurance claim submitted to payer | claim_id, payer_id, amount, nhcx_ref |
| claim.approved | Payer approves the claim | claim_id, approved_amount, settlement_date |
| patient.registered | New patient record created in org | patient_id, abha_linked, org_id |
| ai.interpretation.ready | Async AI lab interpretation complete | job_id, patient_id, report_url, markers[] |
| fhir.bundle.pushed | FHIR bundle pushed to ABDM health locker | bundle_id, patient_id, abha_address, resource_count |
Every outbound webhook POST has this top-level shape:
```json
{
  "id": "evt_01JXYZ...",            // unique event ID (idempotency key)
  "event": "appointment.created",
  "org_id": 42,
  "timestamp": "2026-04-30T10:15:00Z",
  "api_version": "2.0",
  "data": {
    // event-specific fields
  }
}
```
Each request includes an X-PP-Signature header — HMAC-SHA256 of the raw request body signed with your webhook secret. Always verify before processing:
```python
import hashlib
import hmac


def verify_signature(body: bytes, header: str, secret: str) -> bool:
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header)
```
Respond with HTTP 200 within 5 seconds. PatientPulse retries with exponential back-off (3 attempts) if it receives a non-2xx or times out. Use a queue internally if processing is slow.