This skill provides patterns for working with the data-layer module. Use it when creating or editing files in src/data-layer/ or src/lib/data/, or when adding new data sources.
src/data-layer/
├── fetchers/ # Fetch functions (one per data source)
├── index.ts # Public API - typed getter functions
├── tasks.ts # KEYS constant + Trigger.dev scheduled tasks
├── storage.ts # get/set abstraction (Netlify Blobs or mock files)
├── s3.ts # S3 image upload utility for external images
├── mocks/ # Mock data files for local development
└── .env.example # Environment variables for data-layer/Trigger.dev
src/lib/data/
└── index.ts # Next.js caching adapter (createCachedGetter)
The data-layer uses a dedicated .env.local file at src/data-layer/.env.local, separate from the main app's root .env.local.
Copy the example file:
cp src/data-layer/.env.example src/data-layer/.env.local
Fill in the required API keys (see .env.example for all options)
Run Trigger.dev tasks locally:
pnpm trigger:dev
Some variables (GITHUB_TOKEN_READ_ONLY, the Sentry vars) must be configured in both files. Configure environment variables in your Trigger.dev project dashboard as well: the main app and the data-layer run in separate environments.
Defines all task keys and scheduled jobs:
export const KEYS = {
ETH_PRICE: "fetch-eth-price",
L2BEAT: "fetch-l2beat",
// ...
} as const
const DAILY: Task[] = [
[KEYS.APPS, fetchApps],
[KEYS.EVENTS, fetchEvents],
]
const HOURLY: Task[] = [
[KEYS.ETH_PRICE, fetchEthPrice],
[KEYS.BEACONCHAIN, fetchBeaconChain],
]
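A minimal sketch of how these task tuples might be consumed. The `Task` type, the `runTasks` runner, and the in-memory `set` are illustrative stand-ins, not the module's actual implementation:

```typescript
// Illustrative only: a Task pairs a storage key with its fetch function.
type Task = [key: string, fetcher: () => Promise<unknown>]

// Stand-in for storage.ts set(); the real module writes to Netlify Blobs.
const store = new Map<string, unknown>()
async function set(key: string, data: unknown): Promise<void> {
  store.set(key, data)
}

// Hypothetical runner: execute each task and persist its result under its key.
async function runTasks(tasks: Task[]): Promise<void> {
  for (const [key, fetcher] of tasks) {
    await set(key, await fetcher())
  }
}

// Example tasks mirroring the DAILY shape above (data is invented).
const DAILY: Task[] = [["fetch-eth-price", async () => ({ value: 3000 })]]

await runTasks(DAILY)
```

The key in each tuple is what the getters in index.ts later read from storage, which is why the tuple and the getter must agree on it.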
One-liner passthrough functions:
export const getEthPrice = () => get<MetricReturnData>(KEYS.ETH_PRICE)
export const getL2beatData = () => get<L2beatData>(KEYS.L2BEAT)
Simple get/set that switches between Netlify Blobs (prod) and local JSON files (dev):
export async function get<T>(key: string): Promise<T | null>
export async function set(key: string, data: unknown): Promise<void>
Uses USE_MOCK_DATA=true env var for local development.
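A rough sketch of the get/set shape described above, with a single in-memory map standing in for both backends (the real module reads JSON files under mocks/ when USE_MOCK_DATA=true and Netlify Blobs otherwise; only the signatures here come from the source):

```typescript
// Stand-in for the real backends: Netlify Blobs in prod, local JSON
// files under mocks/ when USE_MOCK_DATA=true.
const backend = new Map<string, string>()

// get() resolves to null when the key has never been written.
async function get<T>(key: string): Promise<T | null> {
  const raw = backend.get(key)
  return raw === undefined ? null : (JSON.parse(raw) as T)
}

async function set(key: string, data: unknown): Promise<void> {
  backend.set(key, JSON.stringify(data))
}
```

The null-on-missing behavior is why every getter in index.ts is typed `Promise<T | null>` and why callers need a fallback.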
Centralized S3 upload for external images. Fetchers use this to upload external images to a single S3 bucket, reducing Next.js remotePatterns complexity.
// Upload single image
const s3Url = await uploadToS3(sourceUrl, "events/logos")
// Batch upload (parallel)
const s3Urls = await uploadManyToS3(urls, "apps/banners")
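A hedged sketch of what the batch helper likely does: a parallel Promise.all over the single-image helper, preserving null for failed uploads. The stub uploader and bucket URL below are invented for illustration:

```typescript
// Stub standing in for the real uploadToS3: resolves to an S3 URL,
// or null when the fetch/upload fails.
async function uploadToS3(
  sourceUrl: string,
  prefix: string
): Promise<string | null> {
  if (!sourceUrl.startsWith("https://")) return null // simulate a failure
  return `https://bucket.s3.amazonaws.com/${prefix}/${encodeURIComponent(sourceUrl)}`
}

// Parallel batch upload; results are in the same order as the inputs.
async function uploadManyToS3(
  urls: string[],
  prefix: string
): Promise<(string | null)[]> {
  return Promise.all(urls.map((url) => uploadToS3(url, prefix)))
}

const results = await uploadManyToS3(
  ["https://example.com/a.png", "bad-url"],
  "apps/banners"
)
```

Because nulls survive the batch, callers can map over the results and substitute a fallback per image rather than failing the whole batch.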
Key features:
Returns null for large images.

No transformations in index.ts - just get<T>(KEYS.X):
// Correct
export const getEventsData = () => get<EventItem[]>(KEYS.EVENTS)
// Wrong - no transformations in getters
export const getEventsData = async () => {
const data = await get<EventItem[]>(KEYS.EVENTS)
return data?.map(transform) ?? null
}
All transformations belong in the fetcher (src/data-layer/fetchers/).
All task IDs are defined in KEYS in tasks.ts. The getter in index.ts and the task tuple in DAILY/HOURLY must use the same key.
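One way to enforce that agreement at the type level (illustrative; the module may not actually do this): derive a key union from KEYS and use it in both the getter and the task tuple signatures.

```typescript
const KEYS = {
  ETH_PRICE: "fetch-eth-price",
  EVENTS: "fetch-events",
} as const

// Union of the literal key strings: "fetch-eth-price" | "fetch-events"
type DataKey = (typeof KEYS)[keyof typeof KEYS]

// If both get() and Task accept only DataKey, a typo in either the
// getter or the task tuple becomes a compile error.
async function get<T>(key: DataKey): Promise<T | null> {
  return null // stub; the real get() reads from storage
}
type Task = [key: DataKey, fetcher: () => Promise<unknown>]
```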
Add cached wrapper in src/lib/data/index.ts:
export const getEventsData = createCachedGetter(
dataLayer.getEventsData,
["events-data"],
CACHE_REVALIDATE_DAY // or CACHE_REVALIDATE_HOUR
)
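createCachedGetter presumably wraps the data-layer getter with Next.js caching; a framework-free sketch of its shape, with a simple first-call memoizer standing in for the Next.js cache (keyParts and the revalidate interval are accepted but unused here, where the real adapter would forward them):

```typescript
// Simplified stand-in for the Next.js cache adapter: memoize the first
// result instead of revalidating on an interval.
function createCachedGetter<T>(
  getter: () => Promise<T>,
  keyParts: string[],
  revalidateSeconds: number
): () => Promise<T> {
  let cached: Promise<T> | undefined
  return () => {
    if (cached === undefined) cached = getter()
    return cached
  }
}

let fetchCount = 0
const getEventsData = createCachedGetter(
  async () => {
    fetchCount++
    return [{ title: "Example Event" }] // invented payload
  },
  ["events-data"],
  86400 // one day, mirroring CACHE_REVALIDATE_DAY
)

const first = await getEventsData()
const second = await getEventsData()
```

Repeated calls reuse the cached promise, so the underlying getter runs once; the real adapter would instead re-run it after the revalidate window expires.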
External images should be uploaded to S3 in the fetcher to centralize image domains:
// In fetcher - correct
import { uploadToS3 } from "../s3"
const logoUrl = await uploadToS3(event.logoImage, "events/logos")
return { ...event, logoImage: logoUrl ?? "" }
Always handle null returns (upload failures) with fallback/empty string.
Create fetcher in src/data-layer/fetchers/fetchNewData.ts:
export async function fetchNewData(): Promise<YourDataType> {
// Fetch and transform data here
}
Add key to KEYS in src/data-layer/tasks.ts:
export const KEYS = {
// ...existing keys
NEW_DATA: "fetch-new-data",
} as const
Add task tuple to DAILY or HOURLY in tasks.ts:
const DAILY: Task[] = [
// ...existing tasks
[KEYS.NEW_DATA, fetchNewData],
]
Add getter in src/data-layer/index.ts:
export const getNewData = () => get<YourDataType>(KEYS.NEW_DATA)
Add a mock file at src/data-layer/mocks/fetch-new-data.json for local development.
Add cached wrapper in src/lib/data/index.ts:
export const getNewData = createCachedGetter(
dataLayer.getNewData,
["new-data"],
CACHE_REVALIDATE_HOUR
)