How do I create a pipeline that takes each Instagram post and extracts its insights at 24 hours, 15 days, and 30 days after publish?
```
instagram-insights-pipeline/
├── index.js          ← Entry point (Express server + optional inline cron)
├── package.json
├── schema.sql        ← DB schema (posts, insight_jobs, insights tables)
├── .env.example      ← Copy to .env and fill in your tokens
├── README.md
└── src/
    ├── webhook.js    ← Receives Instagram webhook events
    ├── scheduler.js  ← Registers 3 jobs per new post
    ├── runner.js     ← Cron job: fetches due insights from IG API
    ├── instagram.js  ← Graph API client (no external HTTP libs)
    └── db.js         ← SQLite adapter (swap for your own store)
```
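As a sketch of what the Graph API client could look like: it builds the insights URL per media type and calls it with Node's built-in `fetch`, so no external HTTP library is needed. The metric names and the `v19.0` version string here are illustrative assumptions, not a transcript of `instagram.js`; check Meta's docs for the metrics valid for your media types.

```javascript
// Sketch of an instagram.js-style helper. Metric lists per media type are
// assumptions; verify them against Meta's Graph API documentation.
const GRAPH_BASE = "https://graph.facebook.com/v19.0"; // assumed API version

const METRICS_BY_TYPE = {
  IMAGE: ["impressions", "reach", "saved"],
  VIDEO: ["impressions", "reach", "saved", "video_views"],
  REELS: ["plays", "reach", "saved"],
};

function buildInsightsUrl(mediaId, mediaType, accessToken) {
  const metrics = METRICS_BY_TYPE[mediaType] || METRICS_BY_TYPE.IMAGE;
  const params = new URLSearchParams({
    metric: metrics.join(","),
    access_token: accessToken,
  });
  return `${GRAPH_BASE}/${mediaId}/insights?${params}`;
}

// fetch() is global in Node 18+, so this file needs no dependencies.
async function fetchInsights(mediaId, mediaType, accessToken) {
  const res = await fetch(buildInsightsUrl(mediaId, mediaType, accessToken));
  if (!res.ok) throw new Error(`Graph API error ${res.status}`);
  return (await res.json()).data;
}
```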
| Stage | File | What it does |
|---|---|---|
| Detection | webhook.js | Meta sends a media event → extract media_id |
| Registration | scheduler.js | registerPost() → insert post + 3 jobs (24h / 15d / 30d) |
| Execution | runner.js | Every 5 min → SELECT jobs WHERE run_at <= NOW() → fetch API → save |
| API | instagram.js | Calls /{media_id}/insights with correct metrics per media type |
| Storage | db.js | SQLite adapter — swap out for your existing DB |
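The registration stage boils down to computing three `run_at` timestamps from the post's `published_at`. A minimal sketch of that logic (the function and field names are assumptions about `scheduler.js`, chosen to mirror the `insight_jobs` table):

```javascript
// One post in, three insight jobs out. Offsets are measured from
// published_at, not from when the webhook arrived.
const HOUR = 60 * 60 * 1000;
const DAY = 24 * HOUR;
const CHECKPOINTS = [
  { label: "24h", offsetMs: 24 * HOUR },
  { label: "15d", offsetMs: 15 * DAY },
  { label: "30d", offsetMs: 30 * DAY },
];

function buildJobs(mediaId, publishedAt) {
  const base = new Date(publishedAt).getTime();
  return CHECKPOINTS.map(({ label, offsetMs }) => ({
    media_id: mediaId,
    checkpoint: label,
    run_at: new Date(base + offsetMs).toISOString(),
    status: "pending",
  }));
}
```

`registerPost()` would insert the post row, then insert these three rows into `insight_jobs` in one transaction.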
```bash
# 1. Install
npm install

# 2. Configure
cp .env.example .env
# Fill in: IG_ACCESS_TOKEN, IG_ACCOUNT_ID, WEBHOOK_VERIFY_TOKEN

# 3. Start (server + embedded cron every 5 min)
npm run start:cron
```
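Before Meta delivers any events, it verifies your callback URL with a GET handshake carrying `hub.mode`, `hub.verify_token`, and `hub.challenge` query parameters; you must echo the challenge back if the token matches. The check `webhook.js` needs, sketched as a pure function (the Express wiring in the comment is illustrative):

```javascript
// Meta's webhook verification handshake: return the challenge string on a
// token match, or null so the route can respond 403.
function verifyWebhook(query, expectedToken) {
  if (
    query["hub.mode"] === "subscribe" &&
    query["hub.verify_token"] === expectedToken
  ) {
    return query["hub.challenge"];
  }
  return null;
}

// Illustrative Express usage:
// app.get("/webhook/instagram", (req, res) => {
//   const challenge = verifyWebhook(req.query, process.env.WEBHOOK_VERIFY_TOKEN);
//   challenge ? res.status(200).send(challenge) : res.sendStatus(403);
// });
```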
Notes:

- Webhook setup: subscribe to the `media` field and set the callback URL to `https://your-domain.com/webhook/instagram`.
- Calling `registerPost()` twice for the same post ID is safe (no duplicate jobs).
- Job times are computed from the post's `published_at`, not from when you discovered the post.
- `db.js` exposes a clean interface; replace it with Postgres/MySQL/your ORM without touching any other file.
- Different media types require different metric sets; `instagram.js` handles this automatically.
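The runner's every-5-minutes pass can be sketched with an in-memory stand-in for the `insight_jobs` table; `runner.js` would do the same selection through the SQLite adapter instead. The helper names (`fetchInsights`, `saveInsight`) are hypothetical:

```javascript
// Sketch of runner.js's polling pass against an in-memory job list.
function dueJobs(jobs, now = Date.now()) {
  // Equivalent of: SELECT * FROM insight_jobs
  //                WHERE status = 'pending' AND run_at <= NOW()
  return jobs.filter(
    (j) => j.status === "pending" && Date.parse(j.run_at) <= now
  );
}

async function runOnce(jobs, fetchInsights, saveInsight) {
  for (const job of dueJobs(jobs)) {
    try {
      const data = await fetchInsights(job.media_id); // hypothetical API call
      await saveInsight(job, data);
      job.status = "done";
    } catch (err) {
      job.status = "failed"; // a real runner might retry with backoff instead
    }
  }
}

// Embedded cron: poll every 5 minutes.
// setInterval(() => runOnce(jobs, fetchInsights, saveInsight), 5 * 60 * 1000);
```

Marking jobs `done`/`failed` keeps the pass idempotent: a job already picked up is never selected again.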