Nhost Functions (Serverless)
Deploy serverless functions alongside your backend:
# Enable serverless functions
FUNCTIONS_ENABLED=true
# Functions will be available at:
# https://functions.local.nself.org
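A function is a single file that exports an HTTP handler. A minimal sketch, assuming nself follows the Nhost convention of Express-style handlers under a functions/ directory with the file path mapping to the route:

// functions/hello.ts (file name and route mapping are assumptions)
import { Request, Response } from 'express'

export default (req: Request, res: Response) => {
  // Served under https://functions.local.nself.org once deployed
  res.status(200).json({ message: `Hello, ${req.query.name ?? 'world'}!` })
}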
NestJS Run Service
For long-running microservices (similar to Nhost Run):
# Enable NestJS Run service
NESTJS_RUN_ENABLED=true
# Service will be available at:
# https://run.local.nself.org
The NestJS template includes the following (a minimal controller sketch follows the list):
- GraphQL resolver examples
- REST API endpoints
- Authentication integration
- Database access patterns
- WebSocket support
- Background job processing
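For orientation, a REST endpoint in this style looks roughly like the code below; the controller and route names are illustrative, not taken from the generated template.

// src/status/status.controller.ts (illustrative names, not template output)
import { Controller, Get } from '@nestjs/common'

@Controller('status')
export class StatusController {
  // GET https://run.local.nself.org/status
  @Get()
  getStatus() {
    return { ok: true, service: 'nestjs-run', time: new Date().toISOString() }
  }
}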
Nhost Dashboard
Manage your backend with the official Nhost Dashboard:
# Enable Dashboard
DASHBOARD_ENABLED=true
# Dashboard will be available at:
# https://dashboard.local.nself.org
The dashboard provides:
- Database browser and editor
- GraphQL API explorer
- User management
- Storage file browser
- Logs and monitoring
Backend Services Framework
Enable additional backend services for complex applications. All services share the same Docker network as the core backend:
# Enable services directory
SERVICES_ENABLED=true
# NestJS Services (TypeScript/JavaScript)
NESTJS_ENABLED=true
NESTJS_USE_TYPESCRIPT=true
NESTJS_SERVICES=actions-api,webhook-handler
NESTJS_PORT_START=4000
# BullMQ Queue Workers (requires Redis)
BULLMQ_ENABLED=true
BULLMQ_WORKERS=email-worker,notification-worker
BULLMQ_DASHBOARD_PORT=3200
# GoLang Services
GOLANG_ENABLED=true
GOLANG_SERVICES=currency-fetcher,account-monitor
GOLANG_PORT_START=5000
# Python/FastAPI Services
PYTHON_ENABLED=true
PYTHON_SERVICES=data-analyzer,ml-predictor
PYTHON_PORT_START=6000
This creates a structured /services directory with:
- NestJS: Perfect for Hasura actions and webhooks
- BullMQ: Queue workers for background jobs (requires Redis); a worker sketch follows this list
- GoLang: High-performance microservices
- Python: ML/AI and data processing services
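To make the BullMQ option concrete, here is a sketch of a queue worker; the queue name, job payload, and file path are assumptions rather than actual generated output.

// services/bullmq/email-worker.ts (queue name and payload are assumptions)
import { Worker } from 'bullmq'

const worker = new Worker(
  'email',
  async (job) => {
    // Process whatever a producer (e.g. a NestJS action) enqueued
    console.log(`Sending email to ${job.data.to}`)
  },
  // Services resolve each other by name on the shared Docker network
  { connection: { host: 'redis', port: 6379 } },
)

worker.on('completed', (job) => console.log(`Job ${job.id} completed`))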
Key Features:
- Unified Networking: All services share the same network (can communicate via service names)
- Single Command: nself up starts everything (core + services)
- Health Checks: Built-in health monitoring for all services
- Shared Environment: Common configuration across all services
- Auto Port Assignment: Services get sequential ports automatically
Example inter-service communication:
// From a NestJS service on the shared Docker network
const hasuraUrl = 'http://hasura:8080/v1/graphql'; // resolved by Docker service name
const redisHost = 'redis';                          // resolved by Docker service name
const postgresUrl = process.env.DATABASE_URL;       // shared database connection string
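Building on that snippet, any service can call Hasura directly over the internal network. A minimal sketch, assuming Node 18+ for the global fetch; the admin-secret variable name follows the standard Hasura convention, so check the generated .env for the exact key:

// Works from any TypeScript service on the shared network
async function queryHasura(query: string) {
  const response = await fetch('http://hasura:8080/v1/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Standard Hasura header; the env var name is an assumption
      'x-hasura-admin-secret': process.env.HASURA_GRAPHQL_ADMIN_SECRET ?? '',
    },
    body: JSON.stringify({ query }),
  })
  const { data, errors } = await response.json()
  if (errors) throw new Error(JSON.stringify(errors))
  return data
}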
Hello World Example
The hello world example demonstrates all services working together as "database fillers":
# Enable all services and Redis
SERVICES_ENABLED=true
REDIS_ENABLED=true
NESTJS_ENABLED=true
NESTJS_SERVICES=weather-actions
BULLMQ_ENABLED=true
BULLMQ_WORKERS=weather-processor,currency-processor
GOLANG_ENABLED=true
GOLANG_SERVICES=currency-fetcher
PYTHON_ENABLED=true
PYTHON_SERVICES=data-analyzer
Data Flow Architecture:
- NestJS Weather Actions - Exposes Hasura actions for weather data (a handler sketch follows this list)
- BullMQ Workers - Process weather/currency data in background queues
- GoLang Currency Fetcher - High-performance currency rate fetching
- Python Data Analyzer - ML predictions on collected time-series data
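A sketch of what the weather action handler might look like on the NestJS side; the action name, payload, and response fields below are assumptions based on the description above, not the actual hello-world code.

// weather-actions/src/weather.controller.ts (illustrative, not the real code)
import { Body, Controller, Post } from '@nestjs/common'

@Controller('actions')
export class WeatherController {
  // Hasura forwards action calls as POST requests with an { input } payload
  @Post('fetchWeather')
  async fetchWeather(@Body() body: { input: { location: string } }) {
    const { location } = body.input
    // The real service would call a weather API and enqueue a BullMQ job
    // for the weather-processor worker to store in Postgres/TimescaleDB
    return { location, temperature: 21.5, humidity: 48 }
  }
}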
Database Schema (Time-Series Ready):
- Weather readings with location and timestamps
- Currency exchange rates with historical tracking
- Predictions and analysis results
- All tables use TimescaleDB hypertables for efficient time-series storage and queries
Example Hasura Query:
query GetWeatherAnalysis {
  weather_data(order_by: {timestamp: desc}, limit: 100) {
    location
    temperature
    humidity
    timestamp
    predictions {
      forecast_temperature
      confidence
    }
  }
}