Compare commits: feature/ac ... 7b380fa903 (20 commits)
712  DEPLOYMENT.md
@@ -1,322 +1,381 @@
# SmoothSchedule Production Deployment Guide

This guide covers deploying SmoothSchedule to a production server.

## Table of Contents

1. [Prerequisites](#prerequisites)
2. [Quick Reference](#quick-reference)
3. [Initial Server Setup](#initial-server-setup-first-time-only)
4. [Regular Deployments](#regular-deployments)
5. [Activepieces Updates](#activepieces-updates)
6. [Troubleshooting](#troubleshooting)
7. [Maintenance](#maintenance)

## Prerequisites

### Server Requirements

- Ubuntu 20.04+ or Debian 11+
- 4GB RAM minimum (2GB works but cannot build the Activepieces image)
- 40GB disk space
- Docker and Docker Compose v2 installed
- Domain with wildcard DNS configured

### Local Requirements (for deployment)

- Git access to the repository
- SSH access to the production server
- Docker (for building the Activepieces image)

### Required Accounts/Services

- DigitalOcean Spaces (for static/media files)
- Stripe (for payments)
- Twilio (for SMS/phone features)
- OpenAI API (optional, for the Activepieces AI copilot)

## Quick Reference

```bash
# Regular deployment (after initial setup)
./deploy.sh

# Deploy with Activepieces image rebuild
./deploy.sh --deploy-ap

# Deploy specific services only
./deploy.sh django nginx

# Skip migrations (config changes only)
./deploy.sh --no-migrate
```
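To make the interface above precise, here is a hedged sketch of the flag handling the commands imply. `deploy.sh` is a shell script; this argparse model is only an illustration of the same interface, not its actual implementation.

```python
import argparse

# Model deploy.sh's command line: optional service names plus two flags.
parser = argparse.ArgumentParser(prog="deploy.sh")
parser.add_argument("services", nargs="*", help="rebuild only these services")
parser.add_argument("--deploy-ap", action="store_true",
                    help="rebuild and transfer the Activepieces image")
parser.add_argument("--no-migrate", action="store_true",
                    help="skip Django migrations")

args = parser.parse_args(["django", "nginx"])
assert args.services == ["django", "nginx"]
assert parser.parse_args(["--no-migrate"]).no_migrate is True
assert parser.parse_args(["--deploy-ap"]).deploy_ap is True
```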
## Initial Server Setup (First Time Only)

### 1. Server Preparation

```bash
# SSH into production server
ssh your-user@your-server

# Install Docker (if not already installed)
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER

# Logout and login again for group changes
exit
ssh your-user@your-server
```
### 2. Clone Repository

```bash
git clone https://your-repo-url ~/smoothschedule
cd ~/smoothschedule/smoothschedule
```

### 3. Create Environment Files

Copy the template files and fill in your values:

```bash
mkdir -p .envs/.production
cp .envs.example/.django .envs/.production/.django
cp .envs.example/.postgres .envs/.production/.postgres
cp .envs.example/.activepieces .envs/.production/.activepieces
```
Edit each file with your production values:

```bash
nano .envs/.production/.django
nano .envs/.production/.postgres
nano .envs/.production/.activepieces
```

**Key values to configure:**

| File | Variable | Description |
|------|----------|-------------|
| `.django` | `DJANGO_SECRET_KEY` | Generate: `openssl rand -hex 32` |
| `.django` | `DJANGO_ALLOWED_HOSTS` | `.yourdomain.com` |
| `.django` | `STRIPE_*` | Your Stripe keys (live keys for production) |
| `.django` | `TWILIO_*` | Your Twilio credentials |
| `.django` | `AWS_*` | DigitalOcean Spaces credentials |
| `.postgres` | `POSTGRES_USER` | Generate a random username |
| `.postgres` | `POSTGRES_PASSWORD` | Generate: `openssl rand -hex 32` |
| `.activepieces` | `AP_JWT_SECRET` | Generate: `openssl rand -hex 32` |
| `.activepieces` | `AP_ENCRYPTION_KEY` | Generate: `openssl rand -hex 16` |
| `.activepieces` | `AP_POSTGRES_USERNAME` | Generate a random username |
| `.activepieces` | `AP_POSTGRES_PASSWORD` | Generate: `openssl rand -hex 32` |

**Important:** `AP_JWT_SECRET` must be copied to `.django` as well!
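The random values in the table above can also be generated with Python's stdlib instead of openssl. A minimal sketch, with key names mirroring the env vars (the `env` dict is just for illustration):

```python
import secrets

# token_hex(n) returns 2*n hex characters of cryptographically strong randomness
env = {
    "DJANGO_SECRET_KEY": secrets.token_hex(32),   # 64 hex chars
    "POSTGRES_PASSWORD": secrets.token_hex(32),
    "AP_JWT_SECRET": secrets.token_hex(32),
    "AP_ENCRYPTION_KEY": secrets.token_hex(16),   # 32 hex chars
}

# Remember: AP_JWT_SECRET must be written to .django as well as .activepieces
assert len(env["DJANGO_SECRET_KEY"]) == 64
assert len(env["AP_ENCRYPTION_KEY"]) == 32
```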
### 4. DNS Configuration

Configure these DNS records:

```
Type    Name               Value           TTL
A       yourdomain.com     YOUR_SERVER_IP  300
A       *.yourdomain.com   YOUR_SERVER_IP  300
CNAME   www                yourdomain.com  300
```
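The wildcard record is what lets every tenant subdomain resolve to the same server without per-tenant DNS changes. A pure-Python illustration of the matching (not a DNS client; record names are from the table above):

```python
import fnmatch

# The two A records from the table: one exact, one wildcard
RECORDS = ["yourdomain.com", "*.yourdomain.com"]

def covered(host: str) -> bool:
    """True if one of the configured records matches this hostname."""
    return any(fnmatch.fnmatch(host, pattern) for pattern in RECORDS)

assert covered("yourdomain.com")            # apex record
assert covered("demo.yourdomain.com")       # any tenant subdomain
assert not covered("otherdomain.com")
```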
### 5. Build Activepieces Image (on your local machine)

The production server typically cannot build this image (it requires 4GB+ RAM):

```bash
# On your LOCAL machine, not the server
cd ~/smoothschedule
./scripts/build-activepieces.sh deploy
```
Or manually:

```bash
cd activepieces-fork
docker build -t smoothschedule_production_activepieces .
docker save smoothschedule_production_activepieces | gzip > /tmp/ap.tar.gz
scp /tmp/ap.tar.gz your-user@your-server:/tmp/
ssh your-user@your-server 'gunzip -c /tmp/ap.tar.gz | docker load'
```
### 6. Run Initialization Script

```bash
# On the server
cd ~/smoothschedule/smoothschedule
chmod +x scripts/init-production.sh
./scripts/init-production.sh
```

This script will:

1. Verify environment files
2. Generate any missing security keys
3. Start PostgreSQL and Redis
4. Create the Activepieces database
5. Start all services
6. Run Django migrations
7. Guide you through Activepieces platform setup
### 7. Complete Activepieces Platform Setup

After the init script completes:

1. Visit `https://automations.yourdomain.com`
2. Create an admin account (this creates the platform)
3. Get the platform ID:

   ```bash
   docker compose -f docker-compose.production.yml exec postgres \
     psql -U <ap_db_user> -d activepieces -c "SELECT id FROM platform"
   ```

4. Update `AP_PLATFORM_ID` in both:
   - `.envs/.production/.activepieces`
   - `.envs/.production/.django`
5. Restart services:

   ```bash
   docker compose -f docker-compose.production.yml restart
   ```
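Step 4 is an edit of one `KEY=value` line in two env files. If you want to script it, a minimal sketch (the helper name `set_env_var` is hypothetical, not part of this repo):

```python
def set_env_var(text: str, key: str, value: str) -> str:
    """Replace KEY=... in dotenv-style text, appending the line if absent."""
    lines, found = [], False
    for line in text.splitlines():
        if line.startswith(key + "="):
            lines.append(f"{key}={value}")
            found = True
        else:
            lines.append(line)
    if not found:
        lines.append(f"{key}={value}")
    return "\n".join(lines)

# Same value must land in both .activepieces and .django
updated = set_env_var("DEBUG=0\nAP_PLATFORM_ID=old", "AP_PLATFORM_ID", "abc123")
assert "AP_PLATFORM_ID=abc123" in updated
assert set_env_var("DEBUG=0", "AP_PLATFORM_ID", "abc123").endswith("AP_PLATFORM_ID=abc123")
```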
### 8. Create First Tenant

```bash
docker compose -f docker-compose.production.yml exec django python manage.py shell
```

```python
from smoothschedule.identity.core.models import Tenant, Domain

# Create tenant
tenant = Tenant.objects.create(
    name="Demo Business",
    subdomain="demo",
    schema_name="demo"
)

# Create domain
Domain.objects.create(
    tenant=tenant,
    domain="demo.yourdomain.com",
    is_primary=True
)
```
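The `Domain` record is what routes requests: the request's `Host` header is looked up to pick the tenant's database schema. A simplified, pure-Python illustration of that lookup (this is not the actual implementation used by the multi-tenancy layer):

```python
def resolve_schema(host: str, domains: dict[str, str]) -> str:
    """Map a request hostname to a tenant schema; fall back to 'public'."""
    return domains.get(host, "public")

# Mirrors the Domain row created above: demo.yourdomain.com -> schema "demo"
domains = {"demo.yourdomain.com": "demo"}

assert resolve_schema("demo.yourdomain.com", domains) == "demo"
assert resolve_schema("unknown.yourdomain.com", domains) == "public"
```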
### 9. Provision Activepieces Connection

```bash
docker compose -f docker-compose.production.yml exec django \
  python manage.py provision_ap_connections --tenant demo
```
### 10. Verify Deployment

```bash
# Check all containers are running
docker compose -f docker-compose.production.yml ps

# Test endpoints
curl https://yourdomain.com/api/
curl https://platform.yourdomain.com/
curl https://automations.yourdomain.com/api/v1/health
```
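The three `curl` checks above can be wrapped in a small smoke-test harness if you want to run them repeatedly. A hedged sketch; swap in your real domain, and note that actually calling `check()` requires network access to the server:

```python
from urllib.request import urlopen
from urllib.error import URLError

# One URL per public service, matching the curl commands above
ENDPOINTS = [
    "https://yourdomain.com/api/",
    "https://platform.yourdomain.com/",
    "https://automations.yourdomain.com/api/v1/health",
]

def check(url: str, timeout: float = 5.0):
    """Return the HTTP status code, or None if the endpoint is unreachable."""
    try:
        return urlopen(url, timeout=timeout).status
    except (URLError, OSError):
        return None

assert len(ENDPOINTS) == 3
assert all(url.startswith("https://") for url in ENDPOINTS)
```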
## Regular Deployments

After initial setup, deployments are simple:

```bash
# From your local machine
cd ~/smoothschedule

# Commit and push your changes
git add .
git commit -m "Your changes"
git push

# Deploy
./deploy.sh
```
### Deployment Options

| Command | Description |
|---------|-------------|
| `./deploy.sh` | Full deployment with migrations |
| `./deploy.sh --no-migrate` | Deploy without running migrations |
| `./deploy.sh --deploy-ap` | Rebuild and deploy the Activepieces image |
| `./deploy.sh django` | Rebuild only the Django container |
| `./deploy.sh nginx traefik` | Rebuild specific services |

### What the Deploy Script Does

1. Checks for uncommitted changes
2. Verifies changes are pushed to the remote
3. (If `--deploy-ap`) Builds and transfers the Activepieces image
4. SSHs to the server and pulls the latest code
5. Backs up and restores the `.envs` directory
6. Builds Docker images
7. Starts containers
8. Sets up the Activepieces database (if needed)
9. Runs Django migrations (unless `--no-migrate`)
10. Seeds platform plugins for all tenants
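The ten steps above can be sketched as a data-driven pipeline, which also makes the effect of the flags explicit. Step names here are my own labels, not identifiers from `deploy.sh`:

```python
def plan(deploy_ap: bool = False, migrate: bool = True) -> list[str]:
    """Ordered list of deployment steps, mirroring the numbered list above."""
    steps = ["check_uncommitted", "check_pushed"]
    if deploy_ap:  # --deploy-ap adds the image build/transfer steps
        steps += ["build_ap_image", "transfer_ap_image"]
    steps += ["pull_code", "restore_envs", "build_images",
              "start_containers", "setup_ap_db"]
    if migrate:    # --no-migrate drops this step
        steps.append("migrate")
    steps.append("seed_plugins")
    return steps

assert plan()[0] == "check_uncommitted"
assert "migrate" not in plan(migrate=False)
assert plan(deploy_ap=True)[2] == "build_ap_image"
assert plan()[-1] == "seed_plugins"
```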
## Activepieces Updates

When you modify custom pieces (in `activepieces-fork/`):

1. Make your changes to the piece code
2. Commit and push
3. Deploy with the image flag:

   ```bash
   ./deploy.sh --deploy-ap
   ```

The Activepieces container will:

1. Start with the new image
2. Run `publish-pieces.sh` to register the custom pieces
3. Insert piece metadata into the database

### Custom Pieces

Custom pieces are located in:

- `activepieces-fork/packages/pieces/community/smoothschedule/` - Main SmoothSchedule piece
- `activepieces-fork/packages/pieces/community/python-code/` - Python code execution
- `activepieces-fork/packages/pieces/community/ruby-code/` - Ruby code execution

Piece metadata is registered via:

- `activepieces-fork/custom-pieces-metadata.sql` - Database registration
- `activepieces-fork/publish-pieces.sh` - Container startup script
## Troubleshooting

### View Logs

```bash
# All services
docker compose -f docker-compose.production.yml logs -f

# Specific service
docker compose -f docker-compose.production.yml logs -f django
docker compose -f docker-compose.production.yml logs -f activepieces
docker compose -f docker-compose.production.yml logs -f traefik
```

### Restart Services

```bash
# All services
docker compose -f docker-compose.production.yml restart

# Specific service
docker compose -f docker-compose.production.yml restart django
docker compose -f docker-compose.production.yml restart activepieces
```

### Django Shell

```bash
docker compose -f docker-compose.production.yml exec django python manage.py shell
```

### Database Access

```bash
# SmoothSchedule database
docker compose -f docker-compose.production.yml exec postgres \
  psql -U <postgres_user> -d smoothschedule

# Activepieces database
docker compose -f docker-compose.production.yml exec postgres \
  psql -U <ap_user> -d activepieces
```
### Common Issues

**1. Activepieces pieces not showing up**

```bash
# Check whether the platform exists
docker compose -f docker-compose.production.yml exec postgres \
  psql -U <ap_user> -d activepieces -c "SELECT id FROM platform"

# Restart to re-run piece registration
docker compose -f docker-compose.production.yml restart activepieces

# Check logs for errors
docker compose -f docker-compose.production.yml logs activepieces | grep -i error
```

**2. 502 Bad Gateway**

- The service may still be starting; wait a moment
- Check container health: `docker compose ps`
- Check logs for errors

**3. Database connection errors**

- Verify credentials in `.envs/.production/`
- Ensure PostgreSQL is running: `docker compose ps postgres`

**4. Activepieces embedding not working**

- Verify `AP_JWT_SECRET` matches in both `.django` and `.activepieces`
- Verify `AP_PLATFORM_ID` is set correctly in both files
- Check `AP_EMBEDDING_ENABLED=true` in `.activepieces`

**5. SSL certificate issues**

```bash
# Check Traefik logs
docker compose -f docker-compose.production.yml logs traefik

# Verify DNS is pointing to the server
dig yourdomain.com +short

# Ensure ports 80 and 443 are open
sudo ufw allow 80
sudo ufw allow 443
```
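Why issue 4 hinges on a shared `AP_JWT_SECRET`: embedding works by Django signing a token with that secret, which Activepieces then verifies with its own copy. A stdlib-only illustration of shared-secret (HS256-style) signing, not the actual token format either side uses:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    """Build a JWT-shaped token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_hs256(token: str, secret: str) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign_hs256({"sub": "tenant-demo"}, secret="shared-secret")
assert verify_hs256(token, "shared-secret")         # secrets match: embed works
assert not verify_hs256(token, "different-secret")  # mismatch: verification fails
```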
## Maintenance

### Backups

```bash
# List database backups
docker compose -f docker-compose.production.yml exec postgres backups

# Restore from a backup
docker compose -f docker-compose.production.yml exec postgres restore backup_filename.sql.gz
```

### Monitoring

- **Flower Dashboard**: `https://yourdomain.com:5555` - Celery task monitoring
- **Container Status**: `docker compose ps`
- **Resource Usage**: `docker stats`

### Security Checklist

- [x] SSL/HTTPS enabled via Let's Encrypt (automatic with Traefik)
- [x] All secret keys are unique random values
- [x] Database passwords are strong
- [x] Flower dashboard is password protected
- [ ] Firewall configured (UFW)
- [ ] SSH key-based authentication only
- [ ] Regular backups configured
- [ ] Monitoring/alerting set up
## File Structure

```
smoothschedule/
├── deploy.sh                       # Main deployment script
├── DEPLOYMENT.md                   # This file
├── scripts/
│   └── build-activepieces.sh       # Activepieces image builder
├── smoothschedule/
│   ├── docker-compose.production.yml
│   ├── scripts/
│   │   └── init-production.sh      # One-time initialization
│   ├── .envs/
│   │   └── .production/            # Production secrets (NOT in git)
│   │       ├── .django
│   │       ├── .postgres
│   │       └── .activepieces
│   └── .envs.example/              # Template files (in git)
│       ├── .django
│       ├── .postgres
│       └── .activepieces
└── activepieces-fork/
    ├── Dockerfile
    ├── custom-pieces-metadata.sql
    ├── publish-pieces.sh
    └── packages/pieces/community/
        ├── smoothschedule/         # Main custom piece
        ├── python-code/
        └── ruby-code/
```
activepieces-fork/.gitignore (new file, +5)

```
node_modules/
.nx/cache/
.nx/workspace-data/
dist/
package-lock.json
```
```diff
@@ -1 +1 @@
-1766103708902
+1766280110308
```
```diff
@@ -1,11 +1,15 @@
 FROM node:20.19-bullseye-slim AS base
 
 # Set environment variables early for better layer caching
+# Memory optimizations for low-RAM servers (2GB):
+# - Limit Node.js heap to 1536MB to leave room for system
+# - Disable NX daemon and cloud to reduce overhead
 ENV LANG=en_US.UTF-8 \
     LANGUAGE=en_US:en \
     LC_ALL=en_US.UTF-8 \
     NX_DAEMON=false \
-    NX_NO_CLOUD=true
+    NX_NO_CLOUD=true \
+    NODE_OPTIONS="--max-old-space-size=1536"
 
 # Install all system dependencies in a single layer with cache mounts
 RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
@@ -14,6 +18,8 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
     apt-get install -y --no-install-recommends \
     openssh-client \
     python3 \
+    python3-pip \
+    ruby \
     g++ \
     build-essential \
     git \
@@ -28,17 +34,8 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
     libcap-dev && \
     yarn config set python /usr/bin/python3
 
-RUN export ARCH=$(uname -m) && \
-    if [ "$ARCH" = "x86_64" ]; then \
-    curl -fSL https://github.com/oven-sh/bun/releases/download/bun-v1.3.1/bun-linux-x64-baseline.zip -o bun.zip; \
-    elif [ "$ARCH" = "aarch64" ]; then \
-    curl -fSL https://github.com/oven-sh/bun/releases/download/bun-v1.3.1/bun-linux-aarch64.zip -o bun.zip; \
-    fi
-
-RUN unzip bun.zip \
-    && mv bun-*/bun /usr/local/bin/bun \
-    && chmod +x /usr/local/bin/bun \
-    && rm -rf bun.zip bun-*
+# Install Bun using npm (more reliable than GitHub downloads)
+RUN npm install -g bun@1.3.1
 
 RUN bun --version
 
@@ -62,24 +59,30 @@ WORKDIR /usr/src/app
 # Copy only dependency files first for better layer caching
 COPY .npmrc package.json bun.lock ./
 
-# Install all dependencies with frozen lockfile
+# Install all dependencies
 RUN --mount=type=cache,target=/root/.bun/install/cache \
-    bun install --frozen-lockfile
+    bun install
 
 # Copy source code after dependency installation
 COPY . .
 
-# Build all projects including the SmoothSchedule piece
-RUN npx nx run-many --target=build --projects=react-ui,server-api,pieces-smoothschedule --configuration production --parallel=2 --skip-nx-cache
+# Build all projects including custom pieces
+RUN npx nx run-many --target=build --projects=react-ui,server-api,pieces-smoothschedule,pieces-python-code,pieces-ruby-code,pieces-interfaces --configuration production --parallel=2 --skip-nx-cache
 
 # Install production dependencies only for the backend API
 RUN --mount=type=cache,target=/root/.bun/install/cache \
     cd dist/packages/server/api && \
     bun install --production --frozen-lockfile
 
-# Install dependencies for the SmoothSchedule piece
+# Install dependencies for custom pieces
 RUN --mount=type=cache,target=/root/.bun/install/cache \
     cd dist/packages/pieces/community/smoothschedule && \
+    bun install --production && \
+    cd ../python-code && \
+    bun install --production && \
+    cd ../ruby-code && \
+    bun install --production && \
+    cd ../interfaces && \
     bun install --production
 
 ### STAGE 2: Run ###
@@ -87,24 +90,30 @@ FROM base AS run
 
 WORKDIR /usr/src/app
 
-# Install Nginx and gettext in a single layer with cache mount
+# Install Nginx, gettext, and PostgreSQL client in a single layer with cache mount
 RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
     --mount=type=cache,target=/var/lib/apt,sharing=locked \
     apt-get update && \
-    apt-get install -y --no-install-recommends nginx gettext
+    apt-get install -y --no-install-recommends nginx gettext postgresql-client
 
 # Copy static configuration files first (better layer caching)
 COPY nginx.react.conf /etc/nginx/nginx.conf
 COPY --from=build /usr/src/app/packages/server/api/src/assets/default.cf /usr/local/etc/isolate
 COPY docker-entrypoint.sh .
+COPY custom-pieces-metadata.sql .
+COPY publish-pieces.sh .
 
 # Create all necessary directories in one layer
+# Also create symlink for AP_DEV_PIECES to find pieces in dist folder
+# Structure: /packages/pieces/community -> /dist/packages/pieces/community
 RUN mkdir -p \
     /usr/src/app/dist/packages/server \
     /usr/src/app/dist/packages/engine \
     /usr/src/app/dist/packages/shared \
-    /usr/src/app/dist/packages/pieces && \
-    chmod +x docker-entrypoint.sh
+    /usr/src/app/dist/packages/pieces \
+    /usr/src/app/packages/pieces && \
+    ln -sf /usr/src/app/dist/packages/pieces/community /usr/src/app/packages/pieces/community && \
+    chmod +x docker-entrypoint.sh publish-pieces.sh
 
 # Copy built artifacts from build stage
 COPY --from=build /usr/src/app/LICENSE .
@@ -112,7 +121,8 @@ COPY --from=build /usr/src/app/dist/packages/engine/ ./dist/packages/engine/
 COPY --from=build /usr/src/app/dist/packages/server/ ./dist/packages/server/
 COPY --from=build /usr/src/app/dist/packages/shared/ ./dist/packages/shared/
 COPY --from=build /usr/src/app/dist/packages/pieces/ ./dist/packages/pieces/
-COPY --from=build /usr/src/app/packages ./packages
+# Note: Don't copy /packages folder - it triggers dev piece auto-detection
+# The pre-built pieces in dist/packages/pieces/ are sufficient for production
 
 # Copy frontend files to Nginx document root
 COPY --from=build /usr/src/app/dist/packages/react-ui /usr/share/nginx/html/
```
`activepieces-fork/custom-pieces-metadata.sql` (new file, 57 lines):

```sql
-- ==============================================================================
-- Custom SmoothSchedule Pieces Configuration
-- ==============================================================================
-- This script configures pinned pieces for the Activepieces platform.
-- It runs on container startup via docker-entrypoint.sh.
--
-- NOTE: We do NOT insert pieces into piece_metadata because they are already
-- built into the Docker image in packages/pieces/community/. Activepieces
-- auto-discovers these as OFFICIAL pieces. Adding them to piece_metadata
-- would create duplicates in the UI.
--
-- We ONLY set pinnedPieces to make our pieces appear first in Highlights.
-- ==============================================================================

DO $$
DECLARE
    platform_id varchar(21);
    platform_count integer;
BEGIN
    -- Check if platform table exists and has data
    SELECT COUNT(*) INTO platform_count FROM platform;

    IF platform_count = 0 THEN
        RAISE NOTICE 'No platform found yet - skipping piece configuration';
        RAISE NOTICE 'Pieces will be configured on next container restart after platform is created';
        RETURN;
    END IF;

    SELECT id INTO platform_id FROM platform LIMIT 1;
    RAISE NOTICE 'Configuring pieces for platform: %', platform_id;

    -- Remove any duplicate CUSTOM entries for pieces that are built into the image
    -- These cause duplicates in the UI since they're also discovered from filesystem
    DELETE FROM piece_metadata WHERE name IN (
        '@activepieces/piece-smoothschedule',
        '@activepieces/piece-python-code',
        '@activepieces/piece-ruby-code',
        '@activepieces/piece-interfaces'
    ) AND "pieceType" = 'CUSTOM';

    IF FOUND THEN
        RAISE NOTICE 'Removed duplicate CUSTOM piece entries';
    END IF;

    -- Pin our pieces in the platform so they appear first in Highlights
    -- This works with pieces auto-discovered from the filesystem
    UPDATE platform
    SET "pinnedPieces" = ARRAY[
        '@activepieces/piece-smoothschedule',
        '@activepieces/piece-python-code',
        '@activepieces/piece-ruby-code'
    ]::varchar[]
    WHERE id = platform_id
      AND ("pinnedPieces" = '{}' OR "pinnedPieces" IS NULL OR NOT '@activepieces/piece-smoothschedule' = ANY("pinnedPieces"));

    RAISE NOTICE 'Piece configuration complete';
END $$;
```
```diff
@@ -12,6 +12,10 @@ echo "AP_FAVICON_URL: $AP_FAVICON_URL"
 envsubst '${AP_APP_TITLE} ${AP_FAVICON_URL}' < /usr/share/nginx/html/index.html > /usr/share/nginx/html/index.html.tmp && \
     mv /usr/share/nginx/html/index.html.tmp /usr/share/nginx/html/index.html
 
+# Register custom pieces (publish to Verdaccio and insert metadata)
+if [ -f /usr/src/app/publish-pieces.sh ]; then
+    /usr/src/app/publish-pieces.sh || echo "Warning: Custom pieces registration had issues"
+fi
+
 # Start Nginx server
 nginx -g "daemon off;" &
```
```diff
@@ -31,7 +31,7 @@ http {
         proxy_send_timeout 900s;
     }
 
-    location ~* ^/(?!api/).*.(css|js|jpg|jpeg|png|gif|ico|svg)$ {
+    location ~* ^/(?!api/).*\.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot)$ {
         root /usr/share/nginx/html;
         add_header Expires "0";
         add_header Cache-Control "public, max-age=31536000, immutable";
```
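The tightened `location` regex above (dot now escaped, font extensions added) uses only features that behave the same in JavaScript's regex engine, so it can be sanity-checked outside nginx. A quick sketch; the paths are made-up examples, not from the repository:

```typescript
// Mirror of the nginx location pattern: reject /api/ paths via negative
// lookahead, require a literal dot before the extension, and include the
// newly added font extensions.
const assetRe = /^\/(?!api\/).*\.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot)$/;

const cases: Array<[string, boolean]> = [
  ['/assets/app.js', true],
  ['/fonts/inter.woff2', true],     // covered only after this change
  ['/api/v1/app.js', false],        // excluded by the (?!api/) lookahead
  ['/assets/appXjs', false],        // old unescaped '.' would have matched this
];

for (const [p, expected] of cases) {
  console.assert(assetRe.test(p) === expected, p);
}
```

The escaped dot is the substantive fix: with the old `.*.(css|...)` pattern, any character could stand in for the dot before the extension.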
```diff
@@ -354,6 +354,7 @@
     "@vitest/ui": "1.6.1",
     "autoprefixer": "10.4.15",
     "babel-jest": "30.0.5",
+    "bun": "1.3.5",
     "chalk": "4.1.2",
     "concurrently": "8.2.1",
     "esbuild": "0.25.0",
```
```diff
@@ -93,7 +93,7 @@ export const STANDARD_CLOUD_PLAN: PlatformPlanWithOnlyLimits = {
 }
 
 export const OPEN_SOURCE_PLAN: PlatformPlanWithOnlyLimits = {
-    embeddingEnabled: false,
+    embeddingEnabled: true,
     globalConnectionsEnabled: false,
     customRolesEnabled: false,
     mcpsEnabled: true,
@@ -107,9 +107,9 @@ export const OPEN_SOURCE_PLAN: PlatformPlanWithOnlyLimits = {
     analyticsEnabled: true,
     showPoweredBy: false,
     auditLogEnabled: false,
-    managePiecesEnabled: false,
-    manageTemplatesEnabled: false,
-    customAppearanceEnabled: false,
+    managePiecesEnabled: true,
+    manageTemplatesEnabled: true,
+    customAppearanceEnabled: true,
     teamProjectsLimit: TeamProjectsLimit.NONE,
     projectRolesEnabled: false,
     customDomainsEnabled: false,
```
@@ -0,0 +1,4 @@

```json
{
  "name": "@activepieces/piece-interfaces",
  "version": "0.0.1"
}
```
@@ -0,0 +1,50 @@

```json
{
  "name": "pieces-interfaces",
  "$schema": "../../../../node_modules/nx/schemas/project-schema.json",
  "sourceRoot": "packages/pieces/community/interfaces/src",
  "projectType": "library",
  "targets": {
    "build": {
      "executor": "@nx/js:tsc",
      "outputs": ["{options.outputPath}"],
      "options": {
        "outputPath": "dist/packages/pieces/community/interfaces",
        "tsConfig": "packages/pieces/community/interfaces/tsconfig.lib.json",
        "packageJson": "packages/pieces/community/interfaces/package.json",
        "main": "packages/pieces/community/interfaces/src/index.ts",
        "assets": [],
        "buildableProjectDepsInPackageJsonType": "dependencies",
        "updateBuildableProjectDepsInPackageJson": true
      },
      "dependsOn": ["^build", "prebuild"]
    },
    "publish": {
      "command": "node tools/scripts/publish.mjs pieces-interfaces {args.ver} {args.tag}",
      "dependsOn": ["build"]
    },
    "lint": {
      "executor": "@nx/eslint:lint",
      "outputs": ["{options.outputFile}"]
    },
    "prebuild": {
      "executor": "nx:run-commands",
      "options": {
        "cwd": "packages/pieces/community/interfaces",
        "command": "bun install --no-save --silent"
      },
      "dependsOn": ["^build"]
    }
  },
  "tags": []
}
```
@@ -0,0 +1,14 @@

```typescript
import { createPiece, PieceAuth } from '@activepieces/pieces-framework';
import { PieceCategory } from '@activepieces/shared';

export const interfaces = createPiece({
  displayName: 'Interfaces',
  description: 'Create custom forms and interfaces for your workflows.',
  auth: PieceAuth.None(),
  categories: [PieceCategory.CORE],
  minimumSupportedRelease: '0.52.0',
  logoUrl: 'https://cdn.activepieces.com/pieces/interfaces.svg',
  authors: ['activepieces'],
  actions: [],
  triggers: [],
});
```
@@ -0,0 +1,19 @@

```json
{
  "extends": "../../../../tsconfig.base.json",
  "compilerOptions": {
    "module": "commonjs",
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "noImplicitOverride": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true,
    "noPropertyAccessFromIndexSignature": true
  },
  "files": [],
  "include": [],
  "references": [
    {
      "path": "./tsconfig.lib.json"
    }
  ]
}
```
@@ -0,0 +1,11 @@

```json
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "commonjs",
    "outDir": "../../../../dist/out-tsc",
    "declaration": true,
    "types": ["node"]
  },
  "exclude": ["jest.config.ts", "src/**/*.spec.ts", "src/**/*.test.ts"],
  "include": ["src/**/*.ts"]
}
```
@@ -0,0 +1,5 @@

```json
{
  "name": "@activepieces/piece-python-code",
  "version": "0.0.1",
  "dependencies": {}
}
```
@@ -0,0 +1,60 @@

```json
{
  "name": "pieces-python-code",
  "$schema": "../../../../node_modules/nx/schemas/project-schema.json",
  "sourceRoot": "packages/pieces/community/python-code/src",
  "projectType": "library",
  "release": {
    "version": {
      "currentVersionResolver": "git-tag",
      "preserveLocalDependencyProtocols": false,
      "manifestRootsToUpdate": ["dist/{projectRoot}"]
    }
  },
  "tags": [],
  "targets": {
    "build": {
      "executor": "@nx/js:tsc",
      "outputs": ["{options.outputPath}"],
      "options": {
        "outputPath": "dist/packages/pieces/community/python-code",
        "tsConfig": "packages/pieces/community/python-code/tsconfig.lib.json",
        "packageJson": "packages/pieces/community/python-code/package.json",
        "main": "packages/pieces/community/python-code/src/index.ts",
        "assets": ["packages/pieces/community/python-code/*.md"],
        "buildableProjectDepsInPackageJsonType": "dependencies",
        "updateBuildableProjectDepsInPackageJson": true
      },
      "dependsOn": ["^build", "prebuild"]
    },
    "nx-release-publish": {
      "options": {
        "packageRoot": "dist/{projectRoot}"
      }
    },
    "lint": {
      "executor": "@nx/eslint:lint",
      "outputs": ["{options.outputFile}"]
    },
    "prebuild": {
      "executor": "nx:run-commands",
      "options": {
        "cwd": "packages/pieces/community/python-code",
        "command": "bun install --no-save --silent"
      },
      "dependsOn": ["^build"]
    }
  }
}
```
@@ -0,0 +1,18 @@

```typescript
import { createPiece, PieceAuth } from '@activepieces/pieces-framework';
import { PieceCategory } from '@activepieces/shared';
import { runPythonCode } from './lib/run-python-code';

// Python logo - icon only (from SVGRepo)
const PYTHON_LOGO = 'https://www.svgrepo.com/show/452091/python.svg';

export const pythonCode = createPiece({
  displayName: 'Python Code',
  description: 'Execute Python code in your automations',
  auth: PieceAuth.None(),
  minimumSupportedRelease: '0.36.1',
  logoUrl: PYTHON_LOGO,
  categories: [PieceCategory.CORE, PieceCategory.DEVELOPER_TOOLS],
  authors: ['smoothschedule'],
  actions: [runPythonCode],
  triggers: [],
});
```
@@ -0,0 +1,112 @@

```typescript
import { createAction, Property } from '@activepieces/pieces-framework';
import { exec } from 'child_process';
import { promisify } from 'util';
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';

const execAsync = promisify(exec);

export const runPythonCode = createAction({
  name: 'run_python_code',
  displayName: 'Run Python Code',
  description: 'Execute Python code and return the output. Use print() to output results.',
  props: {
    code: Property.LongText({
      displayName: 'Python Code',
      description: 'The Python code to execute. Use print() to output results that will be captured.',
      required: true,
      defaultValue: `# Example: Process input data
import json

# Access inputs via the 'inputs' variable (parsed from JSON)
# Example: name = inputs.get('name', 'World')

# Your code here
result = "Hello from Python!"

# Print your output (will be captured as the action result)
print(result)`,
    }),
    inputs: Property.Object({
      displayName: 'Inputs',
      description: 'Input data to pass to the Python code. Available as the `inputs` variable (dict).',
      required: false,
      defaultValue: {},
    }),
    timeout: Property.Number({
      displayName: 'Timeout (seconds)',
      description: 'Maximum execution time in seconds',
      required: false,
      defaultValue: 30,
    }),
  },
  async run(context) {
    const { code, inputs, timeout } = context.propsValue;
    const timeoutMs = (timeout || 30) * 1000;

    // Create a temporary file for the Python code
    const tmpDir = os.tmpdir();
    const scriptPath = path.join(tmpDir, `ap_python_${Date.now()}.py`);
    const inputPath = path.join(tmpDir, `ap_python_input_${Date.now()}.json`);

    try {
      // Write inputs to a JSON file
      await fs.promises.writeFile(inputPath, JSON.stringify(inputs || {}));

      // Wrap the user code to load inputs
      const wrappedCode = `
import json
import sys

# Load inputs from JSON file
with open('${inputPath.replace(/\\/g, '\\\\')}', 'r') as f:
    inputs = json.load(f)

# User code starts here
${code}
`;

      // Write the script to a temp file
      await fs.promises.writeFile(scriptPath, wrappedCode);

      // Execute Python
      const { stdout, stderr } = await execAsync(`python3 "${scriptPath}"`, {
        timeout: timeoutMs,
        maxBuffer: 10 * 1024 * 1024, // 10MB buffer
      });

      // Clean up temp files
      await fs.promises.unlink(scriptPath).catch(() => {});
      await fs.promises.unlink(inputPath).catch(() => {});

      // Try to parse output as JSON, otherwise return as string
      const output = stdout.trim();
      let result: unknown;
      try {
        result = JSON.parse(output);
      } catch {
        result = output;
      }

      return {
        success: true,
        output: result,
        stdout: stdout,
        stderr: stderr || null,
      };
    } catch (error: unknown) {
      // Clean up temp files on error
      await fs.promises.unlink(scriptPath).catch(() => {});
      await fs.promises.unlink(inputPath).catch(() => {});

      const execError = error as { stderr?: string; message?: string; killed?: boolean };

      if (execError.killed) {
        throw new Error(`Python script timed out after ${timeout} seconds`);
      }

      throw new Error(`Python execution failed: ${execError.stderr || execError.message}`);
    }
  },
});
```
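The action's output handling, parse stdout as JSON when possible and fall back to the raw string, is easy to reason about in isolation. A minimal sketch; `parseOutput` is a hypothetical helper name for illustration, not part of the piece's API:

```typescript
// Sketch of the output-normalization step: JSON stdout becomes a structured
// value, anything else is returned as a trimmed string.
function parseOutput(stdout: string): unknown {
  const output = stdout.trim();
  try {
    return JSON.parse(output);
  } catch {
    return output;
  }
}
```

So a script ending in `print(json.dumps({"a": 1}))` yields an object downstream, while a bare `print("Hello")` yields the string; numeric-looking output like `42` parses as a number, which flows may need to account for.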
@@ -0,0 +1,19 @@

```json
{
  "extends": "../../../../tsconfig.base.json",
  "compilerOptions": {
    "module": "commonjs",
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "noImplicitOverride": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true,
    "noPropertyAccessFromIndexSignature": true
  },
  "files": [],
  "include": [],
  "references": [
    {
      "path": "./tsconfig.lib.json"
    }
  ]
}
```
@@ -0,0 +1,11 @@

```json
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "commonjs",
    "outDir": "../../../../dist/out-tsc",
    "declaration": true,
    "types": ["node"]
  },
  "exclude": ["jest.config.ts", "src/**/*.spec.ts", "src/**/*.test.ts"],
  "include": ["src/**/*.ts"]
}
```
@@ -0,0 +1,5 @@

```json
{
  "name": "@activepieces/piece-ruby-code",
  "version": "0.0.1",
  "dependencies": {}
}
```
@@ -0,0 +1,60 @@

```json
{
  "name": "pieces-ruby-code",
  "$schema": "../../../../node_modules/nx/schemas/project-schema.json",
  "sourceRoot": "packages/pieces/community/ruby-code/src",
  "projectType": "library",
  "release": {
    "version": {
      "currentVersionResolver": "git-tag",
      "preserveLocalDependencyProtocols": false,
      "manifestRootsToUpdate": ["dist/{projectRoot}"]
    }
  },
  "tags": [],
  "targets": {
    "build": {
      "executor": "@nx/js:tsc",
      "outputs": ["{options.outputPath}"],
      "options": {
        "outputPath": "dist/packages/pieces/community/ruby-code",
        "tsConfig": "packages/pieces/community/ruby-code/tsconfig.lib.json",
        "packageJson": "packages/pieces/community/ruby-code/package.json",
        "main": "packages/pieces/community/ruby-code/src/index.ts",
        "assets": ["packages/pieces/community/ruby-code/*.md"],
        "buildableProjectDepsInPackageJsonType": "dependencies",
        "updateBuildableProjectDepsInPackageJson": true
      },
      "dependsOn": ["^build", "prebuild"]
    },
    "nx-release-publish": {
      "options": {
        "packageRoot": "dist/{projectRoot}"
      }
    },
    "lint": {
      "executor": "@nx/eslint:lint",
      "outputs": ["{options.outputFile}"]
    },
    "prebuild": {
      "executor": "nx:run-commands",
      "options": {
        "cwd": "packages/pieces/community/ruby-code",
        "command": "bun install --no-save --silent"
      },
      "dependsOn": ["^build"]
    }
  }
}
```
@@ -0,0 +1,18 @@

```typescript
import { createPiece, PieceAuth } from '@activepieces/pieces-framework';
import { PieceCategory } from '@activepieces/shared';
import { runRubyCode } from './lib/run-ruby-code';

// Ruby logo - use official Ruby logo from ruby-lang.org
const RUBY_LOGO = 'https://www.ruby-lang.org/images/header-ruby-logo.png';

export const rubyCode = createPiece({
  displayName: 'Ruby Code',
  description: 'Execute Ruby code in your automations',
  auth: PieceAuth.None(),
  minimumSupportedRelease: '0.36.1',
  logoUrl: RUBY_LOGO,
  categories: [PieceCategory.CORE, PieceCategory.DEVELOPER_TOOLS],
  authors: ['smoothschedule'],
  actions: [runRubyCode],
  triggers: [],
});
```
@@ -0,0 +1,113 @@
|
|||||||
|
import { createAction, Property } from '@activepieces/pieces-framework';
|
||||||
|
import { exec } from 'child_process';
|
||||||
|
import { promisify } from 'util';
|
||||||
|
import * as fs from 'fs';
|
||||||
|
import * as path from 'path';
|
||||||
|
import * as os from 'os';
|
||||||
|
|
||||||
|
const execAsync = promisify(exec);
|
||||||
|
|
||||||
|
export const runRubyCode = createAction({
|
||||||
|
name: 'run_ruby_code',
|
||||||
|
displayName: 'Run Ruby Code',
|
||||||
|
description: 'Execute Ruby code and return the output. Use puts or print to output results.',
|
||||||
|
props: {
|
||||||
|
code: Property.LongText({
|
||||||
|
displayName: 'Ruby Code',
|
||||||
|
description: 'The Ruby code to execute. Use puts or print to output results that will be captured.',
|
||||||
|
required: true,
|
||||||
|
defaultValue: `# Example: Process input data
|
||||||
|
require 'json'
|
||||||
|
|
||||||
|
# Access inputs via the 'inputs' variable (parsed from JSON)
|
||||||
|
# Example: name = inputs['name'] || 'World'
|
||||||
|
|
||||||
|
# Your code here
|
||||||
|
result = "Hello from Ruby!"
|
||||||
|
|
||||||
|
# Print your output (will be captured as the action result)
|
||||||
|
puts result`,
|
||||||
|
}),
|
||||||
|
inputs: Property.Object({
|
||||||
|
displayName: 'Inputs',
|
||||||
|
description: 'Input data to pass to the Ruby code. Available as the `inputs` variable (Hash).',
|
||||||
|
required: false,
|
||||||
|
defaultValue: {},
|
||||||
|
}),
|
||||||
|
timeout: Property.Number({
|
||||||
|
displayName: 'Timeout (seconds)',
|
||||||
|
description: 'Maximum execution time in seconds',
|
||||||
|
required: false,
|
||||||
|
defaultValue: 30,
|
||||||
|
}),
|
||||||
|
},
|
||||||
|
async run(context) {
|
||||||
|
const { code, inputs, timeout } = context.propsValue;
|
||||||
|
const timeoutMs = (timeout || 30) * 1000;
|
||||||
|
|
||||||
|
// Create a temporary file for the Ruby code
|
||||||
|
const tmpDir = os.tmpdir();
|
||||||
|
const scriptPath = path.join(tmpDir, `ap_ruby_${Date.now()}.rb`);
|
||||||
|
const inputPath = path.join(tmpDir, `ap_ruby_input_${Date.now()}.json`);
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Write inputs to a JSON file
|
||||||
|
    await fs.promises.writeFile(inputPath, JSON.stringify(inputs || {}));

    // Escape the input path for Ruby
    const escapedInputPath = inputPath.replace(/\\/g, '\\\\').replace(/'/g, "\\'");

    // Wrap the user code to load inputs
    const wrappedCode = `
require 'json'

# Load inputs from JSON file
inputs = JSON.parse(File.read('${escapedInputPath}'))

# User code starts here
${code}
`;

    // Write the script to a temp file
    await fs.promises.writeFile(scriptPath, wrappedCode);

    // Execute Ruby
    const { stdout, stderr } = await execAsync(`ruby "${scriptPath}"`, {
      timeout: timeoutMs,
      maxBuffer: 10 * 1024 * 1024, // 10MB buffer
    });

    // Clean up temp files
    await fs.promises.unlink(scriptPath).catch(() => {});
    await fs.promises.unlink(inputPath).catch(() => {});

    // Try to parse output as JSON, otherwise return as string
    const output = stdout.trim();
    let result: unknown;
    try {
      result = JSON.parse(output);
    } catch {
      result = output;
    }

    return {
      success: true,
      output: result,
      stdout: stdout,
      stderr: stderr || null,
    };
  } catch (error: unknown) {
    // Clean up temp files on error
    await fs.promises.unlink(scriptPath).catch(() => {});
    await fs.promises.unlink(inputPath).catch(() => {});

    const execError = error as { stderr?: string; message?: string; killed?: boolean };

    if (execError.killed) {
      throw new Error(`Ruby script timed out after ${timeout} seconds`);
    }

    throw new Error(`Ruby execution failed: ${execError.stderr || execError.message}`);
  }
},
});
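The stdout handling above falls back to the raw string whenever the script's output is not valid JSON. As a standalone sketch (the helper name is illustrative, not part of the piece):

```typescript
// Illustrative helper mirroring the parse-or-fallback logic above:
// structured Ruby output (JSON printed to stdout) is parsed, anything
// else is returned as the trimmed plain string.
function parseScriptOutput(stdout: string): unknown {
  const output = stdout.trim();
  try {
    return JSON.parse(output);
  } catch {
    return output;
  }
}
```

Note that bare numbers are valid JSON, so `"42"` comes back as the number `42`, not a string.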
@@ -0,0 +1,19 @@
{
  "extends": "../../../../tsconfig.base.json",
  "compilerOptions": {
    "module": "commonjs",
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "noImplicitOverride": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true,
    "noPropertyAccessFromIndexSignature": true
  },
  "files": [],
  "include": [],
  "references": [
    {
      "path": "./tsconfig.lib.json"
    }
  ]
}
@@ -0,0 +1,11 @@
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "commonjs",
    "outDir": "../../../../dist/out-tsc",
    "declaration": true,
    "types": ["node"]
  },
  "exclude": ["jest.config.ts", "src/**/*.spec.ts", "src/**/*.test.ts"],
  "include": ["src/**/*.ts"]
}
File diff suppressed because one or more lines are too long
@@ -5,3 +5,4 @@ export * from './find-events';
 export * from './list-resources';
 export * from './list-services';
 export * from './list-inactive-customers';
+export * from './list-customers';
@@ -0,0 +1,102 @@
import { Property, createAction } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

interface PaginatedResponse<T> {
  results: T[];
  count: number;
  next: string | null;
  previous: string | null;
}

interface Customer {
  id: number;
  email: string;
  first_name: string;
  last_name: string;
  phone: string;
  notes: string;
  created_at: string;
  updated_at: string;
}

export const listCustomersAction = createAction({
  auth: smoothScheduleAuth,
  name: 'list_customers',
  displayName: 'List Customers',
  description: 'Get a list of customers from SmoothSchedule. Useful for customer lookup and bulk operations.',
  props: {
    search: Property.ShortText({
      displayName: 'Search',
      description: 'Search by name, email, or phone number',
      required: false,
    }),
    limit: Property.Number({
      displayName: 'Limit',
      description: 'Maximum number of customers to return (default: 50, max: 500)',
      required: false,
      defaultValue: 50,
    }),
    offset: Property.Number({
      displayName: 'Offset',
      description: 'Number of customers to skip (for pagination)',
      required: false,
      defaultValue: 0,
    }),
    orderBy: Property.StaticDropdown({
      displayName: 'Order By',
      description: 'Sort order for results',
      required: false,
      options: {
        options: [
          { label: 'Newest First', value: '-created_at' },
          { label: 'Oldest First', value: 'created_at' },
          { label: 'Name (A-Z)', value: 'first_name' },
          { label: 'Name (Z-A)', value: '-first_name' },
          { label: 'Email (A-Z)', value: 'email' },
          { label: 'Last Updated', value: '-updated_at' },
        ],
      },
      defaultValue: '-created_at',
    }),
  },
  async run(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const props = context.propsValue;

    const queryParams: Record<string, string> = {};

    if (props.search) {
      queryParams['search'] = props.search;
    }

    // Clamp limit to reasonable range
    const limit = Math.min(Math.max(props.limit || 50, 1), 500);
    queryParams['limit'] = limit.toString();

    if (props.offset && props.offset > 0) {
      queryParams['offset'] = props.offset.toString();
    }

    if (props.orderBy) {
      queryParams['ordering'] = props.orderBy;
    }

    const response = await makeRequest<PaginatedResponse<Customer>>(
      auth,
      HttpMethod.GET,
      '/customers/',
      undefined,
      queryParams
    );

    return {
      customers: response.results || [],
      total_count: response.count || 0,
      has_more: response.next !== null,
      limit: limit,
      offset: props.offset || 0,
    };
  },
});
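The limit handling above silently corrects out-of-range values rather than rejecting them: falsy inputs (including 0) fall back to 50, and everything else is clamped to [1, 500]. The same rule as a pure function (the name is illustrative, not part of the piece):

```typescript
// Mirrors `Math.min(Math.max(props.limit || 50, 1), 500)` from the action above.
// Note: 0 is falsy in JavaScript, so a requested limit of 0 becomes the
// default 50, while negative values survive `||` and get clamped to 1.
function clampLimit(requested?: number): number {
  return Math.min(Math.max(requested || 50, 1), 500);
}
```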
@@ -0,0 +1,23 @@
import { createAction } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

export const listEmailTemplatesAction = createAction({
  auth: smoothScheduleAuth,
  name: 'list_email_templates',
  displayName: 'List Email Templates',
  description: 'Get all available email templates (system and custom)',
  props: {},
  async run(context) {
    const auth = context.auth as SmoothScheduleAuth;

    const response = await makeRequest(
      auth,
      HttpMethod.GET,
      '/emails/templates/'
    );

    return response;
  },
});
@@ -0,0 +1,112 @@
import { Property, createAction } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

export const sendEmailAction = createAction({
  auth: smoothScheduleAuth,
  name: 'send_email',
  displayName: 'Send Email',
  description: 'Send an email using a SmoothSchedule email template',
  props: {
    template_type: Property.StaticDropdown({
      displayName: 'Template Type',
      description: 'Choose whether to use a system template or a custom template',
      required: true,
      options: {
        options: [
          { label: 'System Template', value: 'system' },
          { label: 'Custom Template', value: 'custom' },
        ],
      },
    }),
    email_type: Property.StaticDropdown({
      displayName: 'System Email Type',
      description: 'Select a system email template',
      required: false,
      options: {
        options: [
          { label: 'Appointment Confirmation', value: 'appointment_confirmation' },
          { label: 'Appointment Reminder', value: 'appointment_reminder' },
          { label: 'Appointment Rescheduled', value: 'appointment_rescheduled' },
          { label: 'Appointment Cancelled', value: 'appointment_cancelled' },
          { label: 'Welcome Email', value: 'welcome_email' },
          { label: 'Password Reset', value: 'password_reset' },
          { label: 'Invoice', value: 'invoice' },
          { label: 'Payment Receipt', value: 'payment_receipt' },
          { label: 'Staff Invitation', value: 'staff_invitation' },
          { label: 'Customer Winback', value: 'customer_winback' },
        ],
      },
    }),
    template_slug: Property.ShortText({
      displayName: 'Custom Template Slug',
      description: 'The slug/identifier of your custom email template',
      required: false,
    }),
    to_email: Property.ShortText({
      displayName: 'Recipient Email',
      description: 'The email address to send to',
      required: true,
    }),
    subject_override: Property.ShortText({
      displayName: 'Subject Override',
      description: 'Override the template subject (optional)',
      required: false,
    }),
    reply_to: Property.ShortText({
      displayName: 'Reply-To Email',
      description: 'Reply-to email address (optional)',
      required: false,
    }),
    context: Property.Object({
      displayName: 'Template Variables',
      description: 'Variables to replace in the template (e.g., customer_name, appointment_date)',
      required: false,
    }),
  },
  async run(context) {
    const { template_type, email_type, template_slug, to_email, subject_override, reply_to, context: templateContext } = context.propsValue;
    const auth = context.auth as SmoothScheduleAuth;

    // Validate that the right template identifier is provided based on type
    if (template_type === 'system' && !email_type) {
      throw new Error('System Email Type is required when using System Template');
    }
    if (template_type === 'custom' && !template_slug) {
      throw new Error('Custom Template Slug is required when using Custom Template');
    }

    // Build the request body
    const requestBody: Record<string, unknown> = {
      to_email,
    };

    if (template_type === 'system') {
      requestBody['email_type'] = email_type;
    } else {
      requestBody['template_slug'] = template_slug;
    }

    if (subject_override) {
      requestBody['subject_override'] = subject_override;
    }

    if (reply_to) {
      requestBody['reply_to'] = reply_to;
    }

    if (templateContext && Object.keys(templateContext).length > 0) {
      requestBody['context'] = templateContext;
    }

    const response = await makeRequest(
      auth,
      HttpMethod.POST,
      '/emails/send/',
      requestBody
    );

    return response;
  },
});
@@ -0,0 +1,148 @@
import { createTrigger, TriggerStrategy, Property } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

const TRIGGER_KEY = 'last_status_change_at';

// Event status options from SmoothSchedule backend
const EVENT_STATUSES = [
  { label: 'Any Status', value: '' },
  { label: 'Scheduled', value: 'SCHEDULED' },
  { label: 'En Route', value: 'EN_ROUTE' },
  { label: 'In Progress', value: 'IN_PROGRESS' },
  { label: 'Canceled', value: 'CANCELED' },
  { label: 'Completed', value: 'COMPLETED' },
  { label: 'Awaiting Payment', value: 'AWAITING_PAYMENT' },
  { label: 'Paid', value: 'PAID' },
  { label: 'No Show', value: 'NOSHOW' },
];

export const eventStatusChangedTrigger = createTrigger({
  auth: smoothScheduleAuth,
  name: 'event_status_changed',
  displayName: 'Event Status Changed',
  description: 'Triggers when an event status changes (e.g., Scheduled → In Progress).',
  props: {
    oldStatus: Property.StaticDropdown({
      displayName: 'Previous Status (From)',
      description: 'Only trigger when changing from this status (optional)',
      required: false,
      options: {
        options: EVENT_STATUSES,
      },
    }),
    newStatus: Property.StaticDropdown({
      displayName: 'New Status (To)',
      description: 'Only trigger when changing to this status (optional)',
      required: false,
      options: {
        options: EVENT_STATUSES,
      },
    }),
  },
  type: TriggerStrategy.POLLING,
  async onEnable(context) {
    // Store the current timestamp as the starting point
    await context.store.put(TRIGGER_KEY, new Date().toISOString());
  },
  async onDisable(context) {
    await context.store.delete(TRIGGER_KEY);
  },
  async test(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { oldStatus, newStatus } = context.propsValue;

    const queryParams: Record<string, string> = {
      limit: '5',
    };

    if (oldStatus) {
      queryParams['old_status'] = oldStatus;
    }
    if (newStatus) {
      queryParams['new_status'] = newStatus;
    }

    const statusChanges = await makeRequest<Array<Record<string, unknown>>>(
      auth,
      HttpMethod.GET,
      '/events/status_changes/',
      undefined,
      queryParams
    );

    return statusChanges;
  },
  async run(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { oldStatus, newStatus } = context.propsValue;

    const lastChangeAt = await context.store.get<string>(TRIGGER_KEY) || new Date(0).toISOString();

    const queryParams: Record<string, string> = {
      changed_at__gt: lastChangeAt,
    };

    if (oldStatus) {
      queryParams['old_status'] = oldStatus;
    }
    if (newStatus) {
      queryParams['new_status'] = newStatus;
    }

    const statusChanges = await makeRequest<Array<{ changed_at: string } & Record<string, unknown>>>(
      auth,
      HttpMethod.GET,
      '/events/status_changes/',
      undefined,
      queryParams
    );

    if (statusChanges.length > 0) {
      // Update the last change timestamp
      const maxChangedAt = statusChanges.reduce(
        (max, c) => (c.changed_at > max ? c.changed_at : max),
        lastChangeAt
      );
      await context.store.put(TRIGGER_KEY, maxChangedAt);
    }

    return statusChanges;
  },
  sampleData: {
    id: 1,
    event_id: 12345,
    event: {
      id: 12345,
      title: 'Consultation',
      start_time: '2024-12-01T10:00:00Z',
      end_time: '2024-12-01T11:00:00Z',
      status: 'IN_PROGRESS',
      service: {
        id: 1,
        name: 'Consultation',
      },
      customer: {
        id: 100,
        first_name: 'John',
        last_name: 'Doe',
        email: 'john.doe@example.com',
      },
      resources: [
        { id: 1, name: 'Dr. Smith', type: 'STAFF' },
      ],
    },
    old_status: 'SCHEDULED',
    old_status_display: 'Scheduled',
    new_status: 'IN_PROGRESS',
    new_status_display: 'In Progress',
    changed_by: 'John Smith',
    changed_by_email: 'john@example.com',
    changed_at: '2024-12-01T10:05:00Z',
    notes: 'Started working on the job',
    source: 'mobile_app',
    latitude: 40.7128,
    longitude: -74.0060,
  },
});
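The polling cursor in the trigger above advances by reducing over `changed_at` strings directly, which is safe because ISO-8601 timestamps in the same offset sort lexicographically in chronological order. As a standalone sketch (the function name is illustrative):

```typescript
// Advance a string cursor to the newest changed_at seen; ISO-8601
// timestamps compare correctly as plain strings, so no Date parsing
// is needed here.
function advanceCursor(
  changes: Array<{ changed_at: string }>,
  cursor: string
): string {
  return changes.reduce(
    (max, c) => (c.changed_at > max ? c.changed_at : max),
    cursor
  );
}
```

Using the previous cursor as the initial accumulator also guards against the store timestamp ever moving backwards when the API returns an empty or stale page.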
@@ -1,3 +1,6 @@
 export * from './event-created';
 export * from './event-updated';
 export * from './event-cancelled';
+export * from './event-status-changed';
+export * from './payment-received';
+export * from './upcoming-events';
@@ -0,0 +1,169 @@
import { createTrigger, TriggerStrategy, Property } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

const TRIGGER_KEY = 'last_payment_check_timestamp';

interface PaymentData {
  id: number;
  payment_intent_id: string;
  amount: string;
  currency: string;
  type: 'deposit' | 'final';
  status: string;
  created_at: string;
  completed_at: string;
  event: {
    id: number;
    title: string;
    start_time: string;
    end_time: string;
    status: string;
    deposit_amount: string | null;
    final_price: string | null;
    remaining_balance: string | null;
  };
  service: {
    id: number;
    name: string;
    price: string;
  } | null;
  customer: {
    id: number;
    first_name: string;
    last_name: string;
    email: string;
    phone: string;
  } | null;
}

const SAMPLE_PAYMENT_DATA: PaymentData = {
  id: 12345,
  payment_intent_id: 'pi_3QDEr5GvIfP3a7s90bcd1234',
  amount: '50.00',
  currency: 'usd',
  type: 'deposit',
  status: 'SUCCEEDED',
  created_at: '2024-12-01T10:00:00Z',
  completed_at: '2024-12-01T10:00:05Z',
  event: {
    id: 100,
    title: 'Consultation with John Doe',
    start_time: '2024-12-15T14:00:00Z',
    end_time: '2024-12-15T15:00:00Z',
    status: 'SCHEDULED',
    deposit_amount: '50.00',
    final_price: '200.00',
    remaining_balance: '150.00',
  },
  service: {
    id: 1,
    name: 'Consultation',
    price: '200.00',
  },
  customer: {
    id: 50,
    first_name: 'John',
    last_name: 'Doe',
    email: 'john.doe@example.com',
    phone: '+1-555-0100',
  },
};

export const paymentReceivedTrigger = createTrigger({
  auth: smoothScheduleAuth,
  name: 'payment_received',
  displayName: 'Payment Received',
  description: 'Triggers when a payment is successfully completed in SmoothSchedule.',
  props: {
    paymentType: Property.StaticDropdown({
      displayName: 'Payment Type',
      description: 'Only trigger for specific payment types',
      required: false,
      options: {
        options: [
          { label: 'All Payments', value: 'all' },
          { label: 'Deposit Payments', value: 'deposit' },
          { label: 'Final Payments', value: 'final' },
        ],
      },
      defaultValue: 'all',
    }),
  },
  type: TriggerStrategy.POLLING,
  async onEnable(context) {
    // Store current timestamp as starting point
    await context.store.put(TRIGGER_KEY, new Date().toISOString());
  },
  async onDisable(context) {
    await context.store.delete(TRIGGER_KEY);
  },
  async test(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { paymentType } = context.propsValue;

    const queryParams: Record<string, string> = {
      limit: '5',
    };

    if (paymentType && paymentType !== 'all') {
      queryParams['type'] = paymentType;
    }

    try {
      const payments = await makeRequest<PaymentData[]>(
        auth,
        HttpMethod.GET,
        '/payments/',
        undefined,
        queryParams
      );

      // Return real data if available, otherwise return sample data
      if (payments && payments.length > 0) {
        return payments;
      }
    } catch (error) {
      // Fall through to sample data on error
      console.error('Error fetching payments for sample data:', error);
    }

    // Return static sample data if no real payments exist
    return [SAMPLE_PAYMENT_DATA];
  },
  async run(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { paymentType } = context.propsValue;

    const lastCheck = await context.store.get<string>(TRIGGER_KEY) || new Date(0).toISOString();

    const queryParams: Record<string, string> = {
      'created_at__gt': lastCheck,
      limit: '100',
    };

    if (paymentType && paymentType !== 'all') {
      queryParams['type'] = paymentType;
    }

    const payments = await makeRequest<PaymentData[]>(
      auth,
      HttpMethod.GET,
      '/payments/',
      undefined,
      queryParams
    );

    if (payments.length > 0) {
      // Update the last check timestamp to the most recent payment
      const mostRecent = payments.reduce((latest, p) =>
        new Date(p.completed_at) > new Date(latest.completed_at) ? p : latest
      );
      await context.store.put(TRIGGER_KEY, mostRecent.completed_at);
    }

    return payments;
  },
  sampleData: SAMPLE_PAYMENT_DATA,
});
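Unlike the status-change trigger, the cursor update above compares parsed `Date` objects rather than raw strings, and uses `reduce` without an initial value, so it must only be called on a non-empty array (the code above guards with `payments.length > 0`). A standalone sketch of that selection (the name is illustrative):

```typescript
// Mirrors the reduce above: pick the element with the latest completed_at.
// reduce() without an initial value throws on an empty array, so callers
// must check length first, as the trigger does.
function mostRecentPayment<T extends { completed_at: string }>(payments: T[]): T {
  return payments.reduce((latest, p) =>
    new Date(p.completed_at) > new Date(latest.completed_at) ? p : latest
  );
}
```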
@@ -0,0 +1,190 @@
import { createTrigger, TriggerStrategy, Property } from '@activepieces/pieces-framework';
import { HttpMethod } from '@activepieces/pieces-common';
import { smoothScheduleAuth, SmoothScheduleAuth } from '../../index';
import { makeRequest } from '../common';

const TRIGGER_KEY_PREFIX = 'reminder_sent_event_ids';

interface UpcomingEventData {
  id: number;
  title: string;
  start_time: string;
  end_time: string;
  status: string;
  hours_until_start: number;
  reminder_hours_before: number;
  should_send_reminder: boolean;
  service: {
    id: number;
    name: string;
    duration: number;
    price: string;
    reminder_enabled: boolean;
    reminder_hours_before: number;
  } | null;
  customer: {
    id: number;
    first_name: string;
    last_name: string;
    email: string;
    phone: string;
  } | null;
  resources: Array<{
    id: number;
    name: string;
  }>;
  notes: string | null;
  location: {
    id: number;
    name: string;
    address: string;
  } | null;
  created_at: string;
}

export const upcomingEventsTrigger = createTrigger({
  auth: smoothScheduleAuth,
  name: 'upcoming_events',
  displayName: 'Upcoming Event (Reminder)',
  description: 'Triggers for events starting soon. Use for sending appointment reminders.',
  props: {
    hoursAhead: Property.Number({
      displayName: 'Hours Ahead',
      description: 'Trigger for events starting within this many hours (matches service reminder settings)',
      required: false,
      defaultValue: 24,
    }),
    onlyIfReminderEnabled: Property.Checkbox({
      displayName: 'Only if Reminder Enabled',
      description: 'Only trigger for events where the service has reminders enabled',
      required: false,
      defaultValue: true,
    }),
  },
  type: TriggerStrategy.POLLING,
  async onEnable(context) {
    // Initialize with empty set of processed event IDs
    await context.store.put(TRIGGER_KEY_PREFIX, JSON.stringify([]));
  },
  async onDisable(context) {
    await context.store.delete(TRIGGER_KEY_PREFIX);
  },
  async test(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { hoursAhead } = context.propsValue;

    const queryParams: Record<string, string> = {
      hours_ahead: String(hoursAhead || 24),
      limit: '5',
    };

    const events = await makeRequest<UpcomingEventData[]>(
      auth,
      HttpMethod.GET,
      '/events/upcoming/',
      undefined,
      queryParams
    );

    return events;
  },
  async run(context) {
    const auth = context.auth as SmoothScheduleAuth;
    const { hoursAhead, onlyIfReminderEnabled } = context.propsValue;

    // Get list of event IDs we've already processed
    const processedIdsJson = await context.store.get<string>(TRIGGER_KEY_PREFIX) || '[]';
    let processedIds: number[] = [];
    try {
      processedIds = JSON.parse(processedIdsJson);
    } catch {
      processedIds = [];
    }

    const queryParams: Record<string, string> = {
      hours_ahead: String(hoursAhead || 24),
      limit: '100',
    };

    const events = await makeRequest<UpcomingEventData[]>(
      auth,
      HttpMethod.GET,
      '/events/upcoming/',
      undefined,
      queryParams
    );

    // Filter to only events that should trigger reminders
    const filteredEvents = events.filter((event) => {
      // Skip if already processed
      if (processedIds.includes(event.id)) {
        return false;
      }

      // Check if reminder is appropriate based on service settings
      if (!event.should_send_reminder) {
        return false;
      }

      // Check if service has reminders enabled
      if (onlyIfReminderEnabled && event.service && !event.service.reminder_enabled) {
        return false;
      }

      return true;
    });

    // Update the processed IDs list
    if (filteredEvents.length > 0) {
      const newProcessedIds = [...processedIds, ...filteredEvents.map((e) => e.id)];
      // Keep only last 1000 IDs to prevent unbounded growth
      const trimmedIds = newProcessedIds.slice(-1000);
      await context.store.put(TRIGGER_KEY_PREFIX, JSON.stringify(trimmedIds));
    }

    // Also clean up old IDs (events that have already passed)
    // This runs periodically to keep the list manageable
    if (Math.random() < 0.1) { // 10% of runs
      const currentIds = events.map((e) => e.id);
      const activeProcessedIds = processedIds.filter((id) => currentIds.includes(id));
      await context.store.put(TRIGGER_KEY_PREFIX, JSON.stringify(activeProcessedIds));
    }

    return filteredEvents;
  },
  sampleData: {
    id: 12345,
    title: 'Consultation with John Doe',
    start_time: '2024-12-15T14:00:00Z',
    end_time: '2024-12-15T15:00:00Z',
    status: 'SCHEDULED',
    hours_until_start: 23.5,
    reminder_hours_before: 24,
    should_send_reminder: true,
    service: {
      id: 1,
      name: 'Consultation',
      duration: 60,
      price: '200.00',
      reminder_enabled: true,
      reminder_hours_before: 24,
    },
    customer: {
      id: 50,
      first_name: 'John',
      last_name: 'Doe',
      email: 'john.doe@example.com',
      phone: '+1-555-0100',
    },
    resources: [
      { id: 1, name: 'Dr. Smith' },
    ],
    notes: 'First-time client',
    location: {
      id: 1,
      name: 'Main Office',
      address: '123 Business St',
    },
    created_at: '2024-12-01T10:00:00Z',
  },
});
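The dedup bookkeeping above appends newly fired event IDs and caps the stored list at the most recent 1000 entries. As a pure-function sketch of that rule (the name and default cap parameter are illustrative):

```typescript
// Mirrors the trigger's bookkeeping: append new IDs, then keep only
// the newest `cap` entries via slice(-cap) to bound storage growth.
function recordProcessed(processed: number[], newIds: number[], cap = 1000): number[] {
  return [...processed, ...newIds].slice(-cap);
}
```

Trimming from the front means the oldest IDs are forgotten first, which is the desired behavior since those events have usually already started.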
@@ -1,6 +1,6 @@
 import { t } from 'i18next';
 import { Plus, Globe } from 'lucide-react';
-import { useState } from 'react';
+import { useState, useEffect } from 'react';
 import { useFormContext } from 'react-hook-form';

 import { AutoFormFieldWrapper } from '@/app/builder/piece-properties/auto-form-field-wrapper';
@@ -80,6 +80,27 @@ function ConnectionSelect(params: ConnectionSelectProps) {
     PropertyExecutionType.DYNAMIC;
   const isPLatformAdmin = useIsPlatformAdmin();

+  // Auto-select connection with autoSelect metadata if no connection is selected
+  useEffect(() => {
+    if (isLoadingConnections || !connections?.data) return;
+
+    const currentAuth = form.getValues().settings.input.auth;
+    // Only auto-select if no connection is currently selected
+    if (currentAuth && removeBrackets(currentAuth)) return;
+
+    // Find a connection with autoSelect metadata
+    const autoSelectConnection = connections.data.find(
+      (connection) => (connection as any).metadata?.autoSelect === true
+    );
+
+    if (autoSelectConnection) {
+      form.setValue('settings.input.auth', addBrackets(autoSelectConnection.externalId), {
+        shouldValidate: true,
+        shouldDirty: true,
+      });
+    }
+  }, [connections?.data, isLoadingConnections, form]);
+
   return (
     <FormField
       control={form.control}
@@ -1,5 +1,5 @@
 import { t } from 'i18next';
-import { ArrowLeft, Search, SearchX } from 'lucide-react';
+import { ArrowLeft, Search, SearchX, Sparkles, Building2 } from 'lucide-react';
 import { useState } from 'react';
 import { useNavigate } from 'react-router-dom';

@@ -24,14 +24,19 @@ import {
 import { LoadingSpinner } from '@/components/ui/spinner';
 import { TemplateCard } from '@/features/templates/components/template-card';
 import { TemplateDetailsView } from '@/features/templates/components/template-details-view';
-import { useTemplates } from '@/features/templates/hooks/templates-hook';
+import { useAllTemplates } from '@/features/templates/hooks/templates-hook';
 import { userHooks } from '@/hooks/user-hooks';
-import { PlatformRole, Template, TemplateType } from '@activepieces/shared';
+import { PlatformRole, Template } from '@activepieces/shared';

 export const ExplorePage = () => {
-  const { filteredTemplates, isLoading, search, setSearch } = useTemplates({
-    type: TemplateType.OFFICIAL,
-  });
+  const {
+    filteredCustomTemplates,
+    filteredOfficialTemplates,
+    filteredTemplates,
+    isLoading,
+    search,
+    setSearch,
+  } = useAllTemplates();
   const [selectedTemplate, setSelectedTemplate] = useState<Template | null>(
     null,
   );
@@ -47,6 +52,20 @@ export const ExplorePage = () => {
     setSelectedTemplate(null);
   };

+  const renderTemplateGrid = (templates: Template[]) => (
+    <div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
+      {templates.map((template) => (
+        <TemplateCard
+          key={template.id}
+          template={template}
+          onSelectTemplate={(template) => {
+            setSelectedTemplate(template);
+          }}
+        />
+      ))}
+    </div>
+  );
+
   return (
     <div>
       <ProjectDashboardPageHeader title={t('Explore Templates')} />
@@ -67,7 +86,7 @@ export const ExplorePage = () => {
         </div>
       ) : (
         <>
-          {filteredTemplates?.length === 0 && (
+          {filteredTemplates.length === 0 && (
            <Empty className="min-h-[300px]">
              <EmptyHeader className="max-w-xl">
                <EmptyMedia variant="icon">
|
||||||
@@ -93,17 +112,38 @@ export const ExplorePage = () => {
|
|||||||
)}
|
)}
|
||||||
</Empty>
|
</Empty>
|
||||||
)}
|
)}
|
||||||
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 pb-4">
|
|
||||||
{filteredTemplates?.map((template) => (
|
{/* Custom Templates Section (SmoothSchedule-specific) */}
|
||||||
<TemplateCard
|
{filteredCustomTemplates.length > 0 && (
|
||||||
key={template.id}
|
<div className="mb-8">
|
||||||
template={template}
|
<div className="flex items-center gap-2 mb-4">
|
||||||
onSelectTemplate={(template) => {
|
<Building2 className="w-5 h-5 text-primary" />
|
||||||
setSelectedTemplate(template);
|
<h2 className="text-lg font-semibold">
|
||||||
}}
|
{t('SmoothSchedule Templates')}
|
||||||
/>
|
</h2>
|
||||||
))}
|
<span className="text-sm text-muted-foreground">
|
||||||
|
({filteredCustomTemplates.length})
|
||||||
|
</span>
|
||||||
</div>
|
</div>
|
||||||
|
{renderTemplateGrid(filteredCustomTemplates)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Official Templates Section (from Activepieces cloud) */}
|
||||||
|
{filteredOfficialTemplates.length > 0 && (
|
||||||
|
<div className="mb-8">
|
||||||
|
<div className="flex items-center gap-2 mb-4">
|
||||||
|
<Sparkles className="w-5 h-5 text-amber-500" />
|
||||||
|
<h2 className="text-lg font-semibold">
|
||||||
|
{t('Community Templates')}
|
||||||
|
</h2>
|
||||||
|
<span className="text-sm text-muted-foreground">
|
||||||
|
({filteredOfficialTemplates.length})
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
{renderTemplateGrid(filteredOfficialTemplates)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
</>
|
</>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
```diff
@@ -1,15 +1,64 @@
 import { t } from 'i18next';
+import { useEffect, useState } from 'react';
 
 import { flagsHooks } from '@/hooks/flags-hooks';
+import { useTheme } from '@/components/theme-provider';
 
 const FullLogo = () => {
   const branding = flagsHooks.useWebsiteBranding();
+  const { theme } = useTheme();
+
+  // Track resolved theme from DOM (handles 'system' theme correctly)
+  const [isDark, setIsDark] = useState(() =>
+    document.documentElement.classList.contains('dark')
+  );
+
+  useEffect(() => {
+    // Update when theme changes - check the actual applied class
+    const checkDark = () => {
+      setIsDark(document.documentElement.classList.contains('dark'));
+    };
+    checkDark();
+
+    // Observe class changes on documentElement
+    const observer = new MutationObserver((mutations) => {
+      for (const mutation of mutations) {
+        if (mutation.attributeName === 'class') {
+          checkDark();
+        }
+      }
+    });
+    observer.observe(document.documentElement, { attributes: true });
+
+    return () => observer.disconnect();
+  }, [theme]);
+
+  // Support dark mode by switching logo URLs
+  // Light logo (dark text) for light mode, dark logo (light text) for dark mode
+  const baseLogoUrl = branding.logos.fullLogoUrl;
+
+  // Compute the appropriate logo URL based on theme
+  let logoUrl = baseLogoUrl;
+  if (isDark) {
+    // Need dark logo (light text for dark background)
+    if (baseLogoUrl.includes('-light.svg')) {
+      logoUrl = baseLogoUrl.replace('-light.svg', '-dark.svg');
+    } else if (!baseLogoUrl.includes('-dark.svg')) {
+      logoUrl = baseLogoUrl.replace(/\.svg$/, '-dark.svg');
+    }
+  } else {
+    // Need light logo (dark text for light background)
+    if (baseLogoUrl.includes('-dark.svg')) {
+      logoUrl = baseLogoUrl.replace('-dark.svg', '-light.svg');
+    }
+    // Otherwise use base URL as-is (assumed to be light version)
+  }
+
   return (
     <div className="h-[60px]">
       <img
         className="h-full"
-        src={branding.logos.fullLogoUrl}
+        src={logoUrl}
         alt={t('logo')}
       />
     </div>
```
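The logo selection above has two parts: a `MutationObserver` that tracks the resolved `dark` class on `documentElement`, and pure string rewriting between `-light.svg` and `-dark.svg` variants. The string part can be exercised on its own outside React; a minimal sketch (the `resolveLogoUrl` helper name is ours, not the component's):

```javascript
// Mirror of the FullLogo URL-switching rules: given a base logo URL and the
// resolved theme, pick the matching -light.svg / -dark.svg variant.
function resolveLogoUrl(baseLogoUrl, isDark) {
  if (isDark) {
    // Dark mode wants the dark logo (light text on a dark background)
    if (baseLogoUrl.includes('-light.svg')) {
      return baseLogoUrl.replace('-light.svg', '-dark.svg');
    }
    if (!baseLogoUrl.includes('-dark.svg')) {
      return baseLogoUrl.replace(/\.svg$/, '-dark.svg');
    }
    return baseLogoUrl;
  }
  // Light mode: swap a dark logo back; otherwise assume the base is light
  if (baseLogoUrl.includes('-dark.svg')) {
    return baseLogoUrl.replace('-dark.svg', '-light.svg');
  }
  return baseLogoUrl;
}

const base = 'https://example.com/logo-light.svg';
console.log(resolveLogoUrl(base, true));  // https://example.com/logo-dark.svg
console.log(resolveLogoUrl(base, false)); // https://example.com/logo-light.svg
console.log(resolveLogoUrl('https://example.com/logo.svg', true)); // https://example.com/logo-dark.svg
```

Note the asymmetry: a plain `logo.svg` gets a `-dark` suffix appended in dark mode, but is passed through untouched in light mode, exactly as the component's branch comments describe.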
```diff
@@ -123,7 +123,15 @@ export const billingQueries = {
   usePlatformSubscription: (platformId: string) => {
     return useQuery({
       queryKey: billingKeys.platformSubscription(platformId),
-      queryFn: platformBillingApi.getSubscriptionInfo,
+      queryFn: async () => {
+        try {
+          return await platformBillingApi.getSubscriptionInfo();
+        } catch {
+          // Return null if endpoint doesn't exist (community edition)
+          return null;
+        }
+      },
+      retry: false, // Don't retry on failure
     });
   },
 };
```
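The `queryFn` change above is one instance of a pattern repeated throughout these hooks: call the API, and on any error return a harmless default so community-edition servers that lack the endpoint don't surface errors in the UI. A framework-free sketch of that pattern (`withFallback` is our illustrative name, not part of the codebase):

```javascript
// Generic "optional endpoint" wrapper: resolves to a fallback value instead
// of rejecting when the underlying call fails (e.g. 404 on community edition).
function withFallback(fetcher, fallback) {
  return async (...args) => {
    try {
      return await fetcher(...args);
    } catch {
      return fallback;
    }
  };
}

// Simulated endpoint that is absent on community edition
const getSubscriptionInfo = async () => {
  throw new Error('404: endpoint not available on community edition');
};

const safeGetSubscriptionInfo = withFallback(getSubscriptionInfo, null);
safeGetSubscriptionInfo().then((result) => {
  console.log(result); // null
});
```

Combined with `retry: false`, this keeps React Query from hammering an endpoint that will never succeed while still caching the fallback value under the same query key.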
```diff
@@ -22,8 +22,8 @@ import { ScrollArea } from '@/components/ui/scroll-area';
 import { LoadingSpinner } from '@/components/ui/spinner';
 import { TemplateCard } from '@/features/templates/components/template-card';
 import { TemplateDetailsView } from '@/features/templates/components/template-details-view';
-import { useTemplates } from '@/features/templates/hooks/templates-hook';
-import { Template, TemplateType } from '@activepieces/shared';
+import { useAllTemplates } from '@/features/templates/hooks/templates-hook';
+import { Template } from '@activepieces/shared';
 
 const SelectFlowTemplateDialog = ({
   children,
@@ -32,9 +32,7 @@ const SelectFlowTemplateDialog = ({
   children: React.ReactNode;
   folderId: string;
 }) => {
-  const { filteredTemplates, isLoading, search, setSearch } = useTemplates({
-    type: TemplateType.CUSTOM,
-  });
+  const { filteredTemplates, isLoading, search, setSearch } = useAllTemplates();
   const carousel = useRef<CarouselApi>();
   const [selectedTemplate, setSelectedTemplate] = useState<Template | null>(
     null,
```
```diff
@@ -12,6 +12,7 @@ export const projectMembersHooks = {
   const query = useQuery<ProjectMemberWithUser[]>({
     queryKey: ['project-members', authenticationSession.getProjectId()],
     queryFn: async () => {
+      try {
       const projectId = authenticationSession.getProjectId();
       assertNotNullOrUndefined(projectId, 'Project ID is null');
       const res = await projectMembersApi.list({
@@ -21,11 +22,16 @@ export const projectMembersHooks = {
         limit: 100,
       });
       return res.data;
+      } catch {
+        // Return empty array if endpoint doesn't exist (community edition)
+        return [];
+      }
     },
     staleTime: Infinity,
+    retry: false, // Don't retry on failure
   });
   return {
-    projectMembers: query.data,
+    projectMembers: query.data ?? [],
     isLoading: query.isLoading,
     refetch: query.refetch,
   };
```
```diff
@@ -79,10 +79,14 @@ export const TemplateCard = ({
       className="rounded-lg border border-solid border-dividers overflow-hidden"
     >
       <div className="flex items-center gap-2 p-4">
+        {template.flows && template.flows.length > 0 && template.flows[0].trigger ? (
         <PieceIconList
-          trigger={template.flows![0].trigger}
+          trigger={template.flows[0].trigger}
           maxNumberOfIconsToShow={2}
         />
+        ) : (
+          <div className="h-8 w-8 rounded bg-muted" />
+        )}
       </div>
       <div className="text-sm font-medium px-4 min-h-16">{template.name}</div>
       <div className="py-2 px-4 gap-1 flex items-center">
```
```diff
@@ -13,11 +13,15 @@ export const TemplateDetailsView = ({ template }: TemplateDetailsViewProps) => {
   return (
     <div className="px-2">
       <div className="mb-4 p-8 flex items-center justify-center gap-2 width-full bg-green-300 rounded-lg">
+        {template.flows && template.flows.length > 0 && template.flows[0].trigger ? (
         <PieceIconList
           size="xxl"
-          trigger={template.flows![0].trigger}
+          trigger={template.flows[0].trigger}
           maxNumberOfIconsToShow={3}
         />
+        ) : (
+          <div className="h-16 w-16 rounded bg-muted" />
+        )}
       </div>
       <ScrollArea className="px-2 min-h-[156px] h-[calc(70vh-144px)] max-h-[536px]">
         <div className="mb-4 text-lg font-medium font-black">
```
```diff
@@ -1,7 +1,11 @@
 import { useQuery } from '@tanstack/react-query';
 import { useState } from 'react';
 
-import { ListTemplatesRequestQuery, Template } from '@activepieces/shared';
+import {
+  ListTemplatesRequestQuery,
+  Template,
+  TemplateType,
+} from '@activepieces/shared';
 
 import { templatesApi } from '../lib/templates-api';
 
@@ -9,7 +13,7 @@ export const useTemplates = (request: ListTemplatesRequestQuery) => {
   const [search, setSearch] = useState<string>('');
 
   const { data: templates, isLoading } = useQuery<Template[], Error>({
-    queryKey: ['templates'],
+    queryKey: ['templates', request.type],
     queryFn: async () => {
       const templates = await templatesApi.list(request);
       return templates.data;
@@ -34,3 +38,86 @@ export const useTemplates = (request: ListTemplatesRequestQuery) => {
     setSearch,
   };
 };
+
+/**
+ * Hook to fetch both custom (platform) and official templates
+ */
+export const useAllTemplates = () => {
+  const [search, setSearch] = useState<string>('');
+
+  // Fetch custom templates (platform-specific)
+  const {
+    data: customTemplates,
+    isLoading: isLoadingCustom,
+  } = useQuery<Template[], Error>({
+    queryKey: ['templates', TemplateType.CUSTOM],
+    queryFn: async () => {
+      try {
+        const templates = await templatesApi.list({
+          type: TemplateType.CUSTOM,
+        });
+        return templates.data;
+      } catch {
+        // If custom templates fail (e.g., feature not enabled), return empty array
+        return [];
+      }
+    },
+    staleTime: 0,
+  });
+
+  // Fetch official templates from Activepieces cloud
+  const {
+    data: officialTemplates,
+    isLoading: isLoadingOfficial,
+  } = useQuery<Template[], Error>({
+    queryKey: ['templates', TemplateType.OFFICIAL],
+    queryFn: async () => {
+      try {
+        const templates = await templatesApi.list({
+          type: TemplateType.OFFICIAL,
+        });
+        return templates.data;
+      } catch {
+        return [];
+      }
+    },
+    staleTime: 0,
+  });
+
+  const isLoading = isLoadingCustom || isLoadingOfficial;
+
+  // Combine all templates
+  const allTemplates = [
+    ...(customTemplates || []),
+    ...(officialTemplates || []),
+  ];
+
+  const filteredTemplates = allTemplates.filter((template) => {
+    const templateName = template.name.toLowerCase();
+    const templateDescription = template.description.toLowerCase();
+    return (
+      templateName.includes(search.toLowerCase()) ||
+      templateDescription.includes(search.toLowerCase())
+    );
+  });
+
+  // Separate filtered results by type
+  const filteredCustomTemplates = filteredTemplates.filter(
+    (t) => t.type === TemplateType.CUSTOM,
+  );
+  const filteredOfficialTemplates = filteredTemplates.filter(
+    (t) => t.type === TemplateType.OFFICIAL,
+  );
+
+  return {
+    customTemplates: customTemplates || [],
+    officialTemplates: officialTemplates || [],
+    allTemplates,
+    filteredTemplates,
+    filteredCustomTemplates,
+    filteredOfficialTemplates,
+    isLoading,
+    search,
+    setSearch,
+  };
+};
```
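Setting aside the two `useQuery` calls, `useAllTemplates` reduces to a pure step: merge both lists, filter by a case-insensitive search over `name` and `description`, and split the filtered result back by `TemplateType`. A standalone sketch of that step (function and sample data here are illustrative, not the hook's exports):

```javascript
// Pure core of useAllTemplates: merge, search-filter, then split by type.
const TemplateType = { CUSTOM: 'CUSTOM', OFFICIAL: 'OFFICIAL' };

function filterTemplates(customTemplates, officialTemplates, search) {
  const all = [...customTemplates, ...officialTemplates];
  const needle = search.toLowerCase();
  const filtered = all.filter(
    (t) =>
      t.name.toLowerCase().includes(needle) ||
      t.description.toLowerCase().includes(needle),
  );
  return {
    filteredTemplates: filtered,
    filteredCustomTemplates: filtered.filter((t) => t.type === TemplateType.CUSTOM),
    filteredOfficialTemplates: filtered.filter((t) => t.type === TemplateType.OFFICIAL),
  };
}

const custom = [{ name: 'Booking reminder', description: 'SMS before appointments', type: 'CUSTOM' }];
const official = [{ name: 'Slack alert', description: 'Post to a channel', type: 'OFFICIAL' }];
console.log(filterTemplates(custom, official, 'slack').filteredOfficialTemplates.length); // 1
console.log(filterTemplates(custom, official, 'slack').filteredCustomTemplates.length);   // 0
```

An empty search string matches everything (every string includes `''`), which is why the explore page can render both sections before the user types anything.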
```diff
@@ -57,6 +57,7 @@ export const projectHooks = {
   return useQuery<ProjectWithLimits[], Error>({
     queryKey: ['projects', params],
     queryFn: async () => {
+      try {
       const results = await projectApi.list({
         cursor,
         limit,
@@ -64,8 +65,13 @@ export const projectHooks = {
         ...restParams,
       });
       return results.data;
+      } catch {
+        // Return empty array if endpoint doesn't exist (embedded mode)
+        return [];
+      }
     },
     enabled: !displayName || displayName.length > 0,
+    retry: false,
   });
 },
 useProjectsInfinite: (limit = 20) => {
@@ -77,11 +83,18 @@ export const projectHooks = {
     queryKey: ['projects-infinite', limit],
     getNextPageParam: (lastPage) => lastPage.next,
     initialPageParam: undefined,
-    queryFn: ({ pageParam }) =>
-      projectApi.list({
+    queryFn: async ({ pageParam }) => {
+      try {
+        return await projectApi.list({
         cursor: pageParam as string | undefined,
         limit,
-      }),
+        });
+      } catch {
+        // Return empty page if endpoint doesn't exist (embedded mode)
+        return { data: [], next: null, previous: null };
+      }
+    },
+    retry: false,
   });
 },
 useProjectsForPlatforms: () => {
```
```diff
@@ -247,6 +247,23 @@ export const appConnectionService = (log: FastifyBaseLogger) => ({
     },
 
     async delete(params: DeleteParams): Promise<void> {
+        // Check if connection is protected before deleting
+        const connection = await appConnectionsRepo().findOneBy({
+            id: params.id,
+            platformId: params.platformId,
+            scope: params.scope,
+            ...(params.projectId ? { projectIds: ArrayContains([params.projectId]) } : {}),
+        })
+
+        if (connection?.metadata?.protected) {
+            throw new ActivepiecesError({
+                code: ErrorCode.VALIDATION,
+                params: {
+                    message: 'This connection is protected and cannot be deleted. It is required for SmoothSchedule integration.',
+                },
+            })
+        }
+
         await appConnectionsRepo().delete({
             id: params.id,
             platformId: params.platformId,
```
```diff
@@ -65,8 +65,8 @@ export function generateTheme({
 
 export const defaultTheme = generateTheme({
     primaryColor: '#6e41e2',
-    websiteName: 'Activepieces',
-    fullLogoUrl: 'https://cdn.activepieces.com/brand/full-logo.png',
+    websiteName: 'Automation Builder',
+    fullLogoUrl: 'https://smoothschedule.nyc3.digitaloceanspaces.com/static/images/automation-builder-logo-light.svg',
     favIconUrl: 'https://cdn.activepieces.com/brand/favicon.ico',
     logoIconUrl: 'https://cdn.activepieces.com/brand/logo.svg',
 })
```
```diff
@@ -37,6 +37,19 @@ import { pieceListUtils } from './utils'
 
 export const pieceRepos = repoFactory(PieceMetadataEntity)
 
+// Map of old/renamed piece names to their current names
+// This allows templates with old piece references to still work
+const PIECE_NAME_ALIASES: Record<string, string> = {
+    '@activepieces/piece-text-ai': '@activepieces/piece-ai',
+    '@activepieces/piece-utility-ai': '@activepieces/piece-ai',
+    '@activepieces/piece-image-ai': '@activepieces/piece-ai',
+}
+
+// Cache for dev pieces to avoid reading from disk on every request
+let devPiecesCache: PieceMetadataSchema[] | null = null
+let devPiecesCacheTime: number = 0
+const DEV_PIECES_CACHE_TTL_MS = 60000 // 1 minute cache
+
 export const pieceMetadataService = (log: FastifyBaseLogger) => {
     return {
         async setup(): Promise<void> {
@@ -89,13 +102,35 @@ export const pieceMetadataService = (log: FastifyBaseLogger) => {
                 release: undefined,
                 log,
             })
-            const piece = originalPieces.find((piece) => {
+            let piece = originalPieces.find((piece) => {
                 const strictlyLessThan = (isNil(versionToSearch) || (
                     semVer.compare(piece.version, versionToSearch.nextExcludedVersion) < 0
                     && semVer.compare(piece.version, versionToSearch.baseVersion) >= 0
                 ))
                 return piece.name === name && strictlyLessThan
             })
+
+            // Fall back to latest version if specific version not found
+            // This allows templates with old piece versions to still work
+            if (isNil(piece) && !isNil(version)) {
+                piece = originalPieces.find((p) => p.name === name)
+                if (!isNil(piece)) {
+                    log.info(`Piece ${name} version ${version} not found, falling back to latest version ${piece.version}`)
+                }
+            }
+
+            // Try piece name alias if piece still not found
+            // This handles renamed pieces (e.g., piece-text-ai -> piece-ai)
+            if (isNil(piece)) {
+                const aliasedName = PIECE_NAME_ALIASES[name]
+                if (!isNil(aliasedName)) {
+                    piece = originalPieces.find((p) => p.name === aliasedName)
+                    if (!isNil(piece)) {
+                        log.info(`Piece ${name} not found, using alias ${aliasedName} (version ${piece.version})`)
+                    }
+                }
+            }
+
             const isFiltered = !isNil(piece) && await enterpriseFilteringUtils.isFiltered({
                 piece,
                 projectId,
@@ -287,10 +322,20 @@ const loadDevPiecesIfEnabled = async (log: FastifyBaseLogger): Promise<PieceMeta
     if (isNil(devPiecesConfig) || isEmpty(devPiecesConfig)) {
         return []
     }
+
+    // Check if cache is still valid
+    const now = Date.now()
+    if (!isNil(devPiecesCache) && (now - devPiecesCacheTime) < DEV_PIECES_CACHE_TTL_MS) {
+        log.debug(`Using cached dev pieces (${devPiecesCache.length} pieces, age: ${now - devPiecesCacheTime}ms)`)
+        return devPiecesCache
+    }
+
+    // Cache expired or doesn't exist, load from disk
+    log.info('Loading dev pieces from disk (cache expired or empty)')
     const piecesNames = devPiecesConfig.split(',')
     const pieces = await filePiecesUtils(log).loadDistPiecesMetadata(piecesNames)
 
-    return pieces.map((p): PieceMetadataSchema => ({
+    const result = pieces.map((p): PieceMetadataSchema => ({
         id: apId(),
         ...p,
         projectUsage: 0,
@@ -299,6 +344,13 @@ const loadDevPiecesIfEnabled = async (log: FastifyBaseLogger): Promise<PieceMeta
         created: new Date().toISOString(),
         updated: new Date().toISOString(),
     }))
+
+    // Update cache
+    devPiecesCache = result
+    devPiecesCacheTime = now
+    log.info(`Cached ${result.length} dev pieces`)
+
+    return result
 }
 
 const findOldestCreatedDate = async ({ name, platformId }: { name: string, platformId?: string }): Promise<string> => {
```
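The piece lookup above resolves in three stages: match name and version range, fall back to the latest version of the same piece, then consult `PIECE_NAME_ALIASES` for renamed pieces. A simplified standalone sketch of that resolution order (exact version equality stands in for the real semver range check, and `resolvePiece` is our illustrative name):

```javascript
// Resolution order from the piece lookup, simplified:
// 1) exact name+version, 2) latest version of the same name, 3) alias lookup.
const PIECE_NAME_ALIASES = {
  '@activepieces/piece-text-ai': '@activepieces/piece-ai',
  '@activepieces/piece-utility-ai': '@activepieces/piece-ai',
  '@activepieces/piece-image-ai': '@activepieces/piece-ai',
};

function resolvePiece(pieces, name, version) {
  // Stage 1: exact match (the real service compares a semver range here)
  let piece = pieces.find((p) => p.name === name && p.version === version);
  // Stage 2: any version of the requested piece
  if (!piece && version) {
    piece = pieces.find((p) => p.name === name);
  }
  // Stage 3: renamed piece -> current name
  if (!piece) {
    const aliasedName = PIECE_NAME_ALIASES[name];
    if (aliasedName) {
      piece = pieces.find((p) => p.name === aliasedName);
    }
  }
  return piece ?? null;
}

const pieces = [{ name: '@activepieces/piece-ai', version: '0.5.0' }];
console.log(resolvePiece(pieces, '@activepieces/piece-text-ai', '0.1.0'));
// → the '@activepieces/piece-ai' entry, found via the alias stage
```

The ordering matters: the version fallback runs before the alias lookup, so an installed piece under its current name always wins over an alias redirect.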
```diff
@@ -25,6 +25,29 @@ export const communityTemplates = {
         const templates = await response.json()
         return templates
     },
+    getById: async (id: string): Promise<Template | null> => {
+        const templateSource = system.get(AppSystemProp.TEMPLATES_SOURCE_URL)
+        if (isNil(templateSource)) {
+            return null
+        }
+        // Fetch the template by ID from the cloud templates endpoint
+        const url = `${templateSource}/${id}`
+        try {
+            const response = await fetch(url, {
+                method: 'GET',
+                headers: {
+                    'Content-Type': 'application/json',
+                },
+            })
+            if (!response.ok) {
+                return null
+            }
+            return await response.json()
+        }
+        catch {
+            return null
+        }
+    },
 }
```
```diff
@@ -27,6 +27,14 @@ const edition = system.getEdition()
 
 export const templateController: FastifyPluginAsyncTypebox = async (app) => {
     app.get('/:id', GetParams, async (request) => {
+        // For community edition, try to fetch from cloud templates first
+        if (edition !== ApEdition.CLOUD) {
+            const cloudTemplate = await communityTemplates.getById(request.params.id)
+            if (!isNil(cloudTemplate)) {
+                return cloudTemplate
+            }
+        }
+        // Fall back to local database
         return templateService().getOneOrThrow({ id: request.params.id })
     })
 
```
activepieces-fork/publish-pieces.sh (new file, 168 lines)

```diff
@@ -0,0 +1,168 @@
+#!/bin/sh
+# Publish custom pieces to Verdaccio and register metadata in database
+# This script runs on container startup
+
+set -e
+
+VERDACCIO_URL="${VERDACCIO_URL:-http://verdaccio:4873}"
+PIECES_DIR="/usr/src/app/dist/packages/pieces/community"
+CUSTOM_PIECES="smoothschedule python-code ruby-code interfaces"
+
+# Wait for Verdaccio to be ready
+wait_for_verdaccio() {
+    echo "Waiting for Verdaccio to be ready..."
+    max_attempts=30
+    attempt=0
+    while [ $attempt -lt $max_attempts ]; do
+        if curl -sf "$VERDACCIO_URL/-/ping" > /dev/null 2>&1; then
+            echo "Verdaccio is ready!"
+            return 0
+        fi
+        attempt=$((attempt + 1))
+        echo "Attempt $attempt/$max_attempts - Verdaccio not ready yet..."
+        sleep 2
+    done
+    echo "Warning: Verdaccio not available after $max_attempts attempts"
+    return 1
+}
+
+# Configure npm/bun to use Verdaccio with authentication
+configure_registry() {
+    echo "Configuring npm registry to use Verdaccio..."
+
+    # Register user with Verdaccio first
+    echo "Registering npm user with Verdaccio..."
+    RESPONSE=$(curl -sf -X PUT "$VERDACCIO_URL/-/user/org.couchdb.user:publisher" \
+        -H "Content-Type: application/json" \
+        -d '{"name":"publisher","password":"publisher","email":"publisher@smoothschedule.com"}' 2>&1) || true
+    echo "Registration response: $RESPONSE"
+
+    # Extract token from response if available
+    TOKEN=$(echo "$RESPONSE" | node -pe "JSON.parse(require('fs').readFileSync('/dev/stdin').toString()).token" 2>/dev/null || echo "")
+
+    if [ -n "$TOKEN" ] && [ "$TOKEN" != "undefined" ]; then
+        echo "Using token from registration"
+        cat > ~/.npmrc << EOF
+registry=$VERDACCIO_URL
+//verdaccio:4873/:_authToken=$TOKEN
+EOF
+    else
+        echo "Using basic auth"
+        # Use legacy _auth format (base64 of username:password)
+        AUTH=$(echo -n "publisher:publisher" | base64)
+        cat > ~/.npmrc << EOF
+registry=$VERDACCIO_URL
+//verdaccio:4873/:_auth=$AUTH
+always-auth=true
+EOF
+    fi
+
+    # Create bunfig.toml for bun
+    mkdir -p ~/.bun
+    cat > ~/.bun/bunfig.toml << EOF
+[install]
+registry = "$VERDACCIO_URL"
+EOF
+    echo "Registry configured: $VERDACCIO_URL"
+}
+
+# Publish a piece to Verdaccio
+publish_piece() {
+    piece_name=$1
+    piece_dir="$PIECES_DIR/$piece_name"
+
+    if [ ! -d "$piece_dir" ]; then
+        echo "Warning: Piece directory not found: $piece_dir"
+        return 1
+    fi
+
+    cd "$piece_dir"
+
+    # Get package name and version
+    pkg_name=$(node -p "require('./package.json').name")
+    pkg_version=$(node -p "require('./package.json').version")
+
+    echo "Publishing $pkg_name@$pkg_version to Verdaccio..."
+
+    # Check if already published
+    if npm view "$pkg_name@$pkg_version" --registry "$VERDACCIO_URL" > /dev/null 2>&1; then
+        echo "  $pkg_name@$pkg_version already published, skipping..."
+        return 0
+    fi
+
+    # Publish to Verdaccio (--force to allow republishing)
+    if npm publish --registry "$VERDACCIO_URL" 2>&1; then
+        echo "  Successfully published $pkg_name@$pkg_version"
+    else
+        echo "  Warning: Could not publish $pkg_name (may already exist)"
+    fi
+
+    cd /usr/src/app
+}
+
+# Insert piece metadata into database
+insert_metadata() {
+    if [ -z "$AP_POSTGRES_HOST" ] || [ -z "$AP_POSTGRES_DATABASE" ]; then
+        echo "Warning: Database configuration not available, skipping metadata insertion"
+        return 1
+    fi
+
+    echo "Inserting custom piece metadata into database..."
+    echo "  Host: $AP_POSTGRES_HOST"
+    echo "  Database: $AP_POSTGRES_DATABASE"
+    echo "  User: $AP_POSTGRES_USERNAME"
+
+    # Wait for PostgreSQL to be ready
+    max_attempts=30
+    attempt=0
+    while [ $attempt -lt $max_attempts ]; do
+        if PGPASSWORD="$AP_POSTGRES_PASSWORD" psql -h "$AP_POSTGRES_HOST" -p "${AP_POSTGRES_PORT:-5432}" -U "$AP_POSTGRES_USERNAME" -d "$AP_POSTGRES_DATABASE" -c "SELECT 1" > /dev/null 2>&1; then
+            break
+        fi
+        attempt=$((attempt + 1))
+        echo "Waiting for PostgreSQL... ($attempt/$max_attempts)"
+        sleep 2
+    done
+
+    if [ $attempt -eq $max_attempts ]; then
+        echo "Warning: PostgreSQL not available, skipping metadata insertion"
+        return 1
+    fi
+
+    # Run the SQL file
+    PGPASSWORD="$AP_POSTGRES_PASSWORD" psql -h "$AP_POSTGRES_HOST" -p "${AP_POSTGRES_PORT:-5432}" -U "$AP_POSTGRES_USERNAME" -d "$AP_POSTGRES_DATABASE" -f /usr/src/app/custom-pieces-metadata.sql
+
+    echo "Piece metadata inserted successfully!"
+}
+
+# Main execution
+main() {
+    echo "============================================"
+    echo "Custom Pieces Registration"
+    echo "============================================"
+
+    # Check if Verdaccio is configured and available
+    if [ -n "$VERDACCIO_URL" ] && [ "$VERDACCIO_URL" != "none" ]; then
+        if wait_for_verdaccio; then
+            configure_registry
+
+            # Publish each custom piece
```
|
||||||
|
for piece in $CUSTOM_PIECES; do
|
||||||
|
publish_piece "$piece" || true
|
||||||
|
done
|
||||||
|
else
|
||||||
|
echo "Skipping Verdaccio publishing - pieces are pre-built in image"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
echo "Verdaccio not configured - using pre-built pieces from image"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Insert metadata into database
|
||||||
|
insert_metadata || true
|
||||||
|
|
||||||
|
echo "============================================"
|
||||||
|
echo "Custom Pieces Registration Complete"
|
||||||
|
echo "============================================"
|
||||||
|
}
|
||||||
|
|
||||||
|
main "$@"
|
||||||
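The basic-auth branch of the registration script base64-encodes `username:password` for npm's legacy `_auth` field. A quick standalone check of what ends up in `~/.npmrc` (the `publisher:publisher` credentials mirror the script's Verdaccio defaults and are shown only for illustration; `printf '%s'` is the portable equivalent of the script's `echo -n`, avoiding a trailing newline in the encoded value):

```shell
#!/bin/sh
# Reproduce the legacy _auth value the script writes: base64("username:password").
AUTH=$(printf '%s' "publisher:publisher" | base64)
echo "//verdaccio:4873/:_auth=$AUTH"
# -> //verdaccio:4873/:_auth=cHVibGlzaGVyOnB1Ymxpc2hlcg==
```

Decoding the value back with `base64 -d` is a handy way to confirm a registry credential was written without stray whitespace.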
deploy.sh (153 changes)
@@ -1,15 +1,33 @@
 #!/bin/bash
+# ==============================================================================
 # SmoothSchedule Production Deployment Script
-# Usage: ./deploy.sh [server_user@server_host] [services...]
-# Example: ./deploy.sh poduck@smoothschedule.com              # Build all
-# Example: ./deploy.sh poduck@smoothschedule.com traefik      # Build only traefik
-# Example: ./deploy.sh poduck@smoothschedule.com django nginx # Build django and nginx
+# ==============================================================================
 #
-# Available services: django, traefik, nginx, postgres, celeryworker, celerybeat, flower, awscli
-# Use --no-migrate to skip migrations (useful for config-only changes like traefik)
+# Usage: ./deploy.sh [server] [options] [services...]
 #
-# This script deploys from git repository, not local files.
-# Changes must be committed and pushed before deploying.
+# Examples:
+#   ./deploy.sh                     # Deploy all services
+#   ./deploy.sh --no-migrate        # Deploy without migrations
+#   ./deploy.sh django nginx        # Deploy specific services
+#   ./deploy.sh --deploy-ap         # Build & deploy Activepieces image
+#   ./deploy.sh poduck@server.com   # Deploy to custom server
+#
+# Options:
+#   --no-migrate   Skip database migrations
+#   --deploy-ap    Build Activepieces image locally and transfer to server
+#
+# Available services:
+#   django, traefik, nginx, postgres, celeryworker, celerybeat, flower, awscli, activepieces
+#
+# IMPORTANT: Activepieces Image
+# -----------------------------
+# The production server cannot build the Activepieces image (requires 4GB+ RAM).
+# Use --deploy-ap to build locally and transfer, or manually:
+#   ./scripts/build-activepieces.sh deploy
+#
+# First-time setup:
+#   Run ./smoothschedule/scripts/init-production.sh on the server
+# ==============================================================================

 set -e

@@ -23,12 +41,23 @@ NC='\033[0m' # No Color
 SERVER=""
 SERVICES=""
 SKIP_MIGRATE=false
+DEPLOY_AP=false

 for arg in "$@"; do
   if [[ "$arg" == "--no-migrate" ]]; then
     SKIP_MIGRATE=true
-  elif [[ -z "$SERVER" ]]; then
+  elif [[ "$arg" == "--deploy-ap" ]]; then
+    DEPLOY_AP=true
+  elif [[ "$arg" == *"@"* ]]; then
+    # Looks like user@host
     SERVER="$arg"
+  elif [[ -z "$SERVER" && ! "$arg" =~ ^- ]]; then
+    # First non-flag argument could be server or service
+    if [[ "$arg" =~ ^(django|traefik|nginx|postgres|celeryworker|celerybeat|flower|awscli|activepieces|redis|verdaccio)$ ]]; then
+      SERVICES="$SERVICES $arg"
+    else
+      SERVER="$arg"
+    fi
   else
     SERVICES="$SERVICES $arg"
   fi

@@ -38,6 +67,7 @@ SERVER=${SERVER:-"poduck@smoothschedule.com"}
 SERVICES=$(echo "$SERVICES" | xargs) # Trim whitespace
 REPO_URL="https://git.talova.net/poduck/smoothschedule.git"
 REMOTE_DIR="/home/poduck/smoothschedule"
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

 echo -e "${GREEN}==================================="
 echo "SmoothSchedule Deployment"

@@ -51,6 +81,9 @@ fi
 if [[ "$SKIP_MIGRATE" == "true" ]]; then
   echo "Migrations: SKIPPED"
 fi
+if [[ "$DEPLOY_AP" == "true" ]]; then
+  echo "Activepieces: BUILDING AND DEPLOYING"
+fi
 echo ""

 # Function to print status

@@ -94,10 +127,45 @@ fi
 print_status "All changes committed and pushed!"

-# Step 2: Deploy on server
-print_status "Step 2: Deploying on server..."
+# Step 2: Build and deploy Activepieces image (if requested)
+if [[ "$DEPLOY_AP" == "true" ]]; then
+  print_status "Step 2: Building and deploying Activepieces image..."
+
+  # Check if the build script exists
+  if [[ -f "$SCRIPT_DIR/scripts/build-activepieces.sh" ]]; then
+    "$SCRIPT_DIR/scripts/build-activepieces.sh" deploy "$SERVER"
+  else
+    print_warning "Build script not found, building manually..."
+
+    # Build the image
+    print_status "Building Activepieces Docker image locally..."
+    cd "$SCRIPT_DIR/activepieces-fork"
+    docker build -t smoothschedule_production_activepieces .
+
+    # Save and transfer
+    print_status "Transferring image to server..."
+    docker save smoothschedule_production_activepieces | gzip > /tmp/ap-image.tar.gz
+    scp /tmp/ap-image.tar.gz "$SERVER:/tmp/"
+    ssh "$SERVER" "gunzip -c /tmp/ap-image.tar.gz | docker load && rm /tmp/ap-image.tar.gz"
+    rm /tmp/ap-image.tar.gz
+
+    cd "$SCRIPT_DIR"
+  fi
+
+  print_status "Activepieces image deployed!"
+fi
+
+# Step 3: Deploy on server
+print_status "Step 3: Deploying on server..."
+
+# Set SKIP_AP_BUILD if we already deployed activepieces image
+SKIP_AP_BUILD_VAL="false"
+if $DEPLOY_AP; then
+  SKIP_AP_BUILD_VAL="true"
+fi
+
 ssh "$SERVER" "bash -s" << ENDSSH
+SKIP_AP_BUILD="$SKIP_AP_BUILD_VAL"
 set -e

 echo ">>> Setting up project directory..."

@@ -160,9 +228,16 @@ git log -1 --oneline
 cd smoothschedule

 # Build images (all or specific services)
+# Note: If activepieces was pre-deployed via --deploy-ap, skip rebuilding it
+# Use COMPOSE_PARALLEL_LIMIT to reduce memory usage on low-memory servers
+export COMPOSE_PARALLEL_LIMIT=1
 if [[ -n "$SERVICES" ]]; then
   echo ">>> Building Docker images: $SERVICES..."
   docker compose -f docker-compose.production.yml build $SERVICES
+elif [[ "$SKIP_AP_BUILD" == "true" ]]; then
+  # Skip activepieces build since we pre-built and transferred it
+  echo ">>> Building Docker images (excluding activepieces - pre-built)..."
+  docker compose -f docker-compose.production.yml build django nginx traefik postgres celeryworker celerybeat flower awscli verdaccio
 else
   echo ">>> Building all Docker images..."
   docker compose -f docker-compose.production.yml build

@@ -174,6 +249,61 @@ docker compose -f docker-compose.production.yml up -d
 echo ">>> Waiting for containers to start..."
 sleep 5
+
+# Setup Activepieces database (if not exists)
+echo ">>> Setting up Activepieces database..."
+AP_DB_USER=\$(grep AP_POSTGRES_USERNAME .envs/.production/.activepieces | cut -d= -f2)
+AP_DB_PASS=\$(grep AP_POSTGRES_PASSWORD .envs/.production/.activepieces | cut -d= -f2)
+AP_DB_NAME=\$(grep AP_POSTGRES_DATABASE .envs/.production/.activepieces | cut -d= -f2)
+# Get the Django postgres user from env file (this is the superuser for our DB)
+DJANGO_DB_USER=\$(grep POSTGRES_USER .envs/.production/.postgres | cut -d= -f2)
+DJANGO_DB_USER=\${DJANGO_DB_USER:-postgres}
+
+if [ -n "\$AP_DB_USER" ] && [ -n "\$AP_DB_PASS" ] && [ -n "\$AP_DB_NAME" ]; then
+  # Check if user exists, create if not
+  docker compose -f docker-compose.production.yml exec -T postgres psql -U "\$DJANGO_DB_USER" -d postgres -tc "SELECT 1 FROM pg_roles WHERE rolname='\$AP_DB_USER'" | grep -q 1 || {
+    echo "  Creating Activepieces database user..."
+    docker compose -f docker-compose.production.yml exec -T postgres psql -U "\$DJANGO_DB_USER" -d postgres -c "CREATE USER \"\$AP_DB_USER\" WITH PASSWORD '\$AP_DB_PASS';"
+  }
+
+  # Check if database exists, create if not
+  docker compose -f docker-compose.production.yml exec -T postgres psql -U "\$DJANGO_DB_USER" -d postgres -tc "SELECT 1 FROM pg_database WHERE datname='\$AP_DB_NAME'" | grep -q 1 || {
+    echo "  Creating Activepieces database..."
+    docker compose -f docker-compose.production.yml exec -T postgres psql -U "\$DJANGO_DB_USER" -d postgres -c "CREATE DATABASE \$AP_DB_NAME OWNER \"\$AP_DB_USER\";"
+  }
+  echo "  Activepieces database ready."
+else
+  echo "  Warning: Could not read Activepieces database config from .envs/.production/.activepieces"
+fi
+
+# Wait for Activepieces to be ready
+echo ">>> Waiting for Activepieces to be ready..."
+for i in {1..30}; do
+  if curl -s http://localhost:80/api/v1/health 2>/dev/null | grep -q "ok"; then
+    echo "  Activepieces is ready."
+    break
+  fi
+  if [ \$i -eq 30 ]; then
+    echo "  Warning: Activepieces health check timed out. It may still be starting."
+  fi
+  sleep 2
+done
+
+# Check if Activepieces platform exists
+echo ">>> Checking Activepieces platform..."
+AP_PLATFORM_ID=\$(grep AP_PLATFORM_ID .envs/.production/.activepieces | cut -d= -f2)
+if [ -z "\$AP_PLATFORM_ID" ] || [ "\$AP_PLATFORM_ID" = "" ]; then
+  echo "  WARNING: No AP_PLATFORM_ID configured in .envs/.production/.activepieces"
+  echo "  To initialize Activepieces for the first time:"
+  echo "    1. Visit https://automations.smoothschedule.com"
+  echo "    2. Create an admin user (this creates the platform)"
+  echo "    3. Get the platform ID from the response or database"
+  echo "    4. Update AP_PLATFORM_ID in .envs/.production/.activepieces"
+  echo "    5. Also update AP_PLATFORM_ID in .envs/.production/.django"
+  echo "    6. Restart Activepieces: docker compose -f docker-compose.production.yml restart activepieces"
+else
+  echo "  Activepieces platform configured: \$AP_PLATFORM_ID"
+fi

 # Run migrations unless skipped
 if [[ "$SKIP_MIGRATE" != "true" ]]; then
   echo ">>> Running database migrations..."

@@ -210,6 +340,7 @@ echo "Your application should now be running at:"
 echo " - https://smoothschedule.com"
 echo " - https://platform.smoothschedule.com"
 echo " - https://*.smoothschedule.com (tenant subdomains)"
+echo " - https://automations.smoothschedule.com (Activepieces)"
 echo ""
 echo "To view logs:"
 echo " ssh $SERVER 'cd ~/smoothschedule/smoothschedule && docker compose -f docker-compose.production.yml logs -f'"
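The deploy script's Activepieces database-setup step pulls each credential out of an env file with a `grep | cut` pipeline. A standalone sketch of that extraction pattern (the file path and values below are made up for the demo; only the variable names mirror `.envs/.production/.activepieces`):

```shell
#!/bin/sh
# Demo env file standing in for .envs/.production/.activepieces
cat > /tmp/ap-demo.env << 'EOF'
AP_POSTGRES_USERNAME=ap_user
AP_POSTGRES_DATABASE=activepieces
EOF

# Same extraction pattern deploy.sh runs on the server
AP_DB_USER=$(grep AP_POSTGRES_USERNAME /tmp/ap-demo.env | cut -d= -f2)
echo "$AP_DB_USER"   # -> ap_user

rm /tmp/ap-demo.env
```

One caveat worth knowing: `cut -d= -f2` keeps only the second `=`-separated field, so a value that itself contains `=` (not unusual for generated passwords) would be silently truncated; `cut -d= -f2-` preserves the rest of the line.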
@@ -0,0 +1,71 @@
# Page snapshot

```yaml
- generic [ref=e3]:
  - generic [ref=e7]:
    - link "Smooth Schedule" [ref=e9] [cursor=pointer]:
      - /url: /
      - img [ref=e10]
      - generic [ref=e16]: Smooth Schedule
    - generic [ref=e17]:
      - heading "Orchestrate your business with precision." [level=1] [ref=e18]
      - paragraph [ref=e19]: The all-in-one scheduling platform for businesses of all sizes. Manage resources, staff, and bookings effortlessly.
    - generic [ref=e24]: © 2025 Smooth Schedule Inc.
  - generic [ref=e26]:
    - generic [ref=e27]:
      - heading "Welcome back" [level=2] [ref=e28]
      - paragraph [ref=e29]: Please enter your email and password to sign in.
    - generic [ref=e31]:
      - img [ref=e33]
      - generic [ref=e35]:
        - heading "Authentication Error" [level=3] [ref=e36]
        - generic [ref=e37]: Invalid credentials
    - generic [ref=e38]:
      - generic [ref=e39]:
        - generic [ref=e40]:
          - generic [ref=e41]: Email
          - generic [ref=e42]:
            - generic:
              - img
            - textbox "Email" [ref=e43]:
              - /placeholder: Enter your email
              - text: owner@demo.com
        - generic [ref=e44]:
          - generic [ref=e45]: Password
          - generic [ref=e46]:
            - generic:
              - img
            - textbox "Password" [ref=e47]:
              - /placeholder: ••••••••
              - text: demopass123
      - button "Sign in" [ref=e48]:
        - generic [ref=e49]:
          - text: Sign in
          - img [ref=e50]
      - generic [ref=e57]: Or continue with
      - button "🇺🇸 English" [ref=e60]:
        - img [ref=e61]
        - generic [ref=e64]: 🇺🇸
        - generic [ref=e65]: English
        - img [ref=e66]
    - generic [ref=e68]:
      - heading "🔓 Quick Login (Dev Only)" [level=3] [ref=e70]:
        - generic [ref=e71]: 🔓
        - generic [ref=e72]: Quick Login (Dev Only)
      - generic [ref=e73]:
        - button "Business Owner TENANT_OWNER" [ref=e74]:
          - generic [ref=e75]:
            - generic [ref=e76]: Business Owner
            - generic [ref=e77]: TENANT_OWNER
        - button "Staff (Full Access) TENANT_STAFF" [ref=e78]:
          - generic [ref=e79]:
            - generic [ref=e80]: Staff (Full Access)
            - generic [ref=e81]: TENANT_STAFF
        - button "Staff (Limited) TENANT_STAFF" [ref=e82]:
          - generic [ref=e83]:
            - generic [ref=e84]: Staff (Limited)
            - generic [ref=e85]: TENANT_STAFF
      - generic [ref=e86]:
        - text: "Password for all:"
        - code [ref=e87]: test123
```
Binary file not shown (new image, 446 KiB).
File diff suppressed because one or more lines are too long
@@ -10,7 +10,7 @@ import { useCurrentUser, useMasquerade, useLogout } from './hooks/useAuth';
 import { useCurrentBusiness } from './hooks/useBusiness';
 import { useUpdateBusiness } from './hooks/useBusiness';
 import { usePlanFeatures } from './hooks/usePlanFeatures';
-import { setCookie } from './utils/cookies';
+import { setCookie, deleteCookie } from './utils/cookies';

 // Import Login Page
 const LoginPage = React.lazy(() => import('./pages/LoginPage'));

@@ -321,9 +321,37 @@ const AppContent: React.FC = () => {
     return hostname === 'localhost' || hostname === '127.0.0.1' || parts.length === 2;
   };

-  // On root domain, ALWAYS show marketing site (even if logged in)
-  // Logged-in users will see a "Go to Dashboard" link in the navbar
+  // On root domain, handle logged-in users appropriately
   if (isRootDomain()) {
+    // If user is logged in as a business user (owner, staff, resource), redirect to their tenant dashboard
+    if (user) {
+      const isBusinessUserOnRoot = ['owner', 'staff', 'resource'].includes(user.role);
+      const isCustomerOnRoot = user.role === 'customer';
+      const hostname = window.location.hostname;
+      const parts = hostname.split('.');
+      const baseDomain = parts.length >= 2 ? parts.slice(-2).join('.') : hostname;
+      const port = window.location.port ? `:${window.location.port}` : '';
+      const protocol = window.location.protocol;
+
+      // Business users on root domain: redirect to their tenant dashboard
+      if (isBusinessUserOnRoot && user.business_subdomain) {
+        window.location.href = `${protocol}//${user.business_subdomain}.${baseDomain}${port}/dashboard`;
+        return <LoadingScreen />;
+      }
+
+      // Customers on root domain: log them out and show the form
+      // Customers should only access their business subdomain
+      if (isCustomerOnRoot) {
+        deleteCookie('access_token');
+        deleteCookie('refresh_token');
+        localStorage.removeItem('masquerade_stack');
+        // Don't redirect, just let them see the page as unauthenticated
+        window.location.reload();
+        return <LoadingScreen />;
+      }
+    }
+
+    // Show marketing site for unauthenticated users and platform users (who should use platform subdomain)
     return (
       <Suspense fallback={<LoadingScreen />}>
         <Routes>

@@ -463,6 +491,16 @@ const AppContent: React.FC = () => {
     return <LoadingScreen />;
   }

+  // RULE: Non-platform users on platform subdomain should have their session cleared
+  // This handles cases where masquerading changed tokens to a business user
+  if (!isPlatformUser && isPlatformDomain) {
+    deleteCookie('access_token');
+    deleteCookie('refresh_token');
+    localStorage.removeItem('masquerade_stack');
+    window.location.href = '/platform/login';
+    return <LoadingScreen />;
+  }

   // RULE: Business users must be on their own business subdomain
   if (isBusinessUser && isBusinessSubdomain && user.business_subdomain && user.business_subdomain !== currentSubdomain) {
     const port = window.location.port ? `:${window.location.port}` : '';

@@ -470,16 +508,23 @@ const AppContent: React.FC = () => {
     return <LoadingScreen />;
   }

-  // RULE: Customers must be on their business subdomain
-  if (isCustomer && isPlatformDomain && user.business_subdomain) {
-    const port = window.location.port ? `:${window.location.port}` : '';
-    window.location.href = `${protocol}//${user.business_subdomain}.${baseDomain}${port}/`;
+  // RULE: Customers must only access their own business subdomain
+  // If on platform domain or wrong business subdomain, log them out and let them use the form
+  if (isCustomer && isPlatformDomain) {
+    deleteCookie('access_token');
+    deleteCookie('refresh_token');
+    localStorage.removeItem('masquerade_stack');
+    window.location.reload();
     return <LoadingScreen />;
   }

   if (isCustomer && isBusinessSubdomain && user.business_subdomain && user.business_subdomain !== currentSubdomain) {
-    const port = window.location.port ? `:${window.location.port}` : '';
-    window.location.href = `${protocol}//${user.business_subdomain}.${baseDomain}${port}/`;
+    // Customer is on a different business's subdomain - log them out
+    // They might be trying to book with a different business
+    deleteCookie('access_token');
+    deleteCookie('refresh_token');
+    localStorage.removeItem('masquerade_stack');
+    window.location.reload();
     return <LoadingScreen />;
   }

@@ -713,7 +758,8 @@ const AppContent: React.FC = () => {
           <Route path="/" element={<PublicPage />} />
           <Route path="/book" element={<BookingFlow />} />
           <Route path="/embed" element={<EmbedBooking />} />
-          <Route path="/login" element={<LoginPage />} />
+          {/* Logged-in business users on their own subdomain get redirected to dashboard */}
+          <Route path="/login" element={<Navigate to="/dashboard" replace />} />
           <Route path="/sign/:token" element={<ContractSigning />} />

           {/* Dashboard routes inside BusinessLayout */}
frontend/src/api/activepieces.ts (new file, 36 lines)
@@ -0,0 +1,36 @@
import api from './client';

export interface DefaultFlow {
  flow_type: string;
  display_name: string;
  activepieces_flow_id: string | null;
  is_modified: boolean;
  is_enabled: boolean;
}

export interface RestoreFlowResponse {
  success: boolean;
  flow_type: string;
  message: string;
}

export interface RestoreAllResponse {
  success: boolean;
  restored: string[];
  failed: string[];
}

export const getDefaultFlows = async (): Promise<DefaultFlow[]> => {
  const response = await api.get('/activepieces/default-flows/');
  return response.data.flows;
};

export const restoreFlow = async (flowType: string): Promise<RestoreFlowResponse> => {
  const response = await api.post(`/activepieces/default-flows/${flowType}/restore/`);
  return response.data;
};

export const restoreAllFlows = async (): Promise<RestoreAllResponse> => {
  const response = await api.post('/activepieces/default-flows/restore-all/');
  return response.data;
};
@@ -291,6 +291,7 @@ const Sidebar: React.FC<SidebarProps> = ({ business, user, isCollapsed, toggleCo
           label={t('nav.automations', 'Automations')}
           isCollapsed={isCollapsed}
           locked={!canUse('automations')}
+          badgeElement={<UnfinishedBadge />}
         />
       </SidebarSection>
     )}
@@ -1,13 +1,12 @@
|
|||||||
import React, { useState } from 'react';
|
import React, { useState } from 'react';
|
||||||
import { motion, AnimatePresence } from 'framer-motion';
|
import { motion, AnimatePresence } from 'framer-motion';
|
||||||
import { Mail, Calendar, Bell, ArrowRight, Zap, CheckCircle2, Code, LayoutGrid } from 'lucide-react';
|
import { Mail, Calendar, Bell, ArrowRight, Zap, CheckCircle2 } from 'lucide-react';
|
||||||
import { useTranslation } from 'react-i18next';
|
import { useTranslation } from 'react-i18next';
|
||||||
import CodeBlock from './CodeBlock';
|
import WorkflowVisual from './WorkflowVisual';
|
||||||
|
|
||||||
const AutomationShowcase: React.FC = () => {
|
const AutomationShowcase: React.FC = () => {
|
||||||
const { t } = useTranslation();
|
const { t } = useTranslation();
|
||||||
const [activeTab, setActiveTab] = useState(0);
|
const [activeTab, setActiveTab] = useState(0);
|
||||||
const [viewMode, setViewMode] = useState<'marketplace' | 'code'>('marketplace');
|
|
||||||
|
|
||||||
const examples = [
|
const examples = [
|
||||||
{
|
{
|
||||||
@@ -15,37 +14,40 @@ const AutomationShowcase: React.FC = () => {
|
|||||||
icon: Mail,
|
icon: Mail,
|
||||||
title: t('marketing.plugins.examples.winback.title'),
|
title: t('marketing.plugins.examples.winback.title'),
|
||||||
description: t('marketing.plugins.examples.winback.description'),
|
description: t('marketing.plugins.examples.winback.description'),
|
||||||
stats: [t('marketing.plugins.examples.winback.stats.retention'), t('marketing.plugins.examples.winback.stats.revenue')],
|
stats: [
|
||||||
marketplaceImage: 'bg-gradient-to-br from-pink-500 to-rose-500',
|
t('marketing.plugins.examples.winback.stats.retention'),
|
||||||
code: t('marketing.plugins.examples.winback.code'),
|
t('marketing.plugins.examples.winback.stats.revenue'),
|
||||||
|
],
|
||||||
|
variant: 'winback' as const,
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
id: 'noshow',
|
id: 'noshow',
|
||||||
icon: Bell,
|
icon: Bell,
|
||||||
title: t('marketing.plugins.examples.noshow.title'),
|
title: t('marketing.plugins.examples.noshow.title'),
|
||||||
description: t('marketing.plugins.examples.noshow.description'),
|
description: t('marketing.plugins.examples.noshow.description'),
|
||||||
stats: [t('marketing.plugins.examples.noshow.stats.reduction'), t('marketing.plugins.examples.noshow.stats.utilization')],
|
stats: [
|
||||||
marketplaceImage: 'bg-gradient-to-br from-blue-500 to-cyan-500',
|
t('marketing.plugins.examples.noshow.stats.reduction'),
|
||||||
code: t('marketing.plugins.examples.noshow.code'),
|
t('marketing.plugins.examples.noshow.stats.utilization'),
|
||||||
|
],
|
||||||
|
variant: 'noshow' as const,
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
id: 'report',
|
id: 'report',
|
||||||
icon: Calendar,
|
icon: Calendar,
|
||||||
title: t('marketing.plugins.examples.report.title'),
|
title: t('marketing.plugins.examples.report.title'),
|
||||||
description: t('marketing.plugins.examples.report.description'),
|
description: t('marketing.plugins.examples.report.description'),
|
||||||
stats: [t('marketing.plugins.examples.report.stats.timeSaved'), t('marketing.plugins.examples.report.stats.visibility')],
|
stats: [
|
||||||
marketplaceImage: 'bg-gradient-to-br from-purple-500 to-indigo-500',
|
t('marketing.plugins.examples.report.stats.timeSaved'),
|
||||||
code: t('marketing.plugins.examples.report.code'),
|
t('marketing.plugins.examples.report.stats.visibility'),
|
||||||
|
],
|
||||||
|
variant: 'report' as const,
|
||||||
},
|
},
|
||||||
];
|
];
|
||||||
|
|
||||||
const CurrentIcon = examples[activeTab].icon;
|
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<section className="py-24 bg-gray-50 dark:bg-gray-900 overflow-hidden">
|
<section className="py-24 bg-gray-50 dark:bg-gray-900 overflow-hidden">
|
||||||
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
|
||||||
<div className="grid lg:grid-cols-2 gap-16 items-center">
|
<div className="grid lg:grid-cols-2 gap-16 items-center">
|
||||||
|
|
||||||
{/* Left Column: Content */}
|
{/* Left Column: Content */}
|
||||||
<div>
|
<div>
|
||||||
<div className="inline-flex items-center gap-2 px-3 py-1 rounded-full bg-brand-100 dark:bg-brand-900/30 text-brand-600 dark:text-brand-400 text-sm font-medium mb-6">
|
<div className="inline-flex items-center gap-2 px-3 py-1 rounded-full bg-brand-100 dark:bg-brand-900/30 text-brand-600 dark:text-brand-400 text-sm font-medium mb-6">
|
||||||
@@ -66,21 +68,30 @@ const AutomationShowcase: React.FC = () => {
                 <button
                   key={example.id}
                   onClick={() => setActiveTab(index)}
-                  className={`w-full text-left p-4 rounded-xl transition-all duration-200 border ${activeTab === index
+                  className={`w-full text-left p-4 rounded-xl transition-all duration-200 border ${
+                    activeTab === index
                       ? 'bg-white dark:bg-gray-800 border-brand-500 shadow-lg scale-[1.02]'
                       : 'bg-transparent border-transparent hover:bg-white/50 dark:hover:bg-gray-800/50'
                   }`}
                 >
                   <div className="flex items-start gap-4">
-                    <div className={`p-2 rounded-lg ${activeTab === index
+                    <div
+                      className={`p-2 rounded-lg ${
+                        activeTab === index
                           ? 'bg-brand-100 text-brand-600 dark:bg-brand-900/50 dark:text-brand-400'
                           : 'bg-gray-100 text-gray-500 dark:bg-gray-800 dark:text-gray-400'
-                    }`}>
+                      }`}
+                    >
                       <example.icon className="w-6 h-6" />
                     </div>
                     <div>
-                      <h3 className={`font-semibold mb-1 ${activeTab === index ? 'text-gray-900 dark:text-white' : 'text-gray-600 dark:text-gray-400'
-                      }`}>
+                      <h3
+                        className={`font-semibold mb-1 ${
+                          activeTab === index
+                            ? 'text-gray-900 dark:text-white'
+                            : 'text-gray-600 dark:text-gray-400'
+                        }`}
+                      >
                         {example.title}
                       </h3>
                       <p className="text-sm text-gray-500 dark:text-gray-500">
@@ -93,101 +104,54 @@ const AutomationShowcase: React.FC = () => {
             </div>
           </div>
 
-          {/* Right Column: Visuals */}
+          {/* Right Column: Workflow Visual */}
           <div className="relative">
             {/* Background Decor */}
             <div className="absolute -inset-4 bg-gradient-to-r from-brand-500/20 to-purple-500/20 rounded-3xl blur-2xl opacity-50" />
 
-            {/* View Toggle */}
-            <div className="absolute -top-12 right-0 flex bg-gray-100 dark:bg-gray-800 p-1 rounded-lg border border-gray-200 dark:border-gray-700">
-              <button
-                onClick={() => setViewMode('marketplace')}
-                className={`flex items-center gap-2 px-3 py-1.5 rounded-md text-sm font-medium transition-all ${viewMode === 'marketplace'
-                  ? 'bg-white dark:bg-gray-700 text-gray-900 dark:text-white shadow-sm'
-                  : 'text-gray-500 hover:text-gray-700 dark:hover:text-gray-300'
-                }`}
-              >
-                <LayoutGrid className="w-4 h-4" />
-                {t('marketing.plugins.viewToggle.marketplace')}
-              </button>
-              <button
-                onClick={() => setViewMode('code')}
-                className={`flex items-center gap-2 px-3 py-1.5 rounded-md text-sm font-medium transition-all ${viewMode === 'code'
-                  ? 'bg-white dark:bg-gray-700 text-gray-900 dark:text-white shadow-sm'
-                  : 'text-gray-500 hover:text-gray-700 dark:hover:text-gray-300'
-                }`}
-              >
-                <Code className="w-4 h-4" />
-                {t('marketing.plugins.viewToggle.developer')}
-              </button>
-            </div>
-
             <AnimatePresence mode="wait">
               <motion.div
-                key={`${activeTab}-${viewMode}`}
+                key={activeTab}
                 initial={{ opacity: 0, y: 20 }}
                 animate={{ opacity: 1, y: 0 }}
                 exit={{ opacity: 0, y: -20 }}
                 transition={{ duration: 0.3 }}
-                className="relative mt-8" // Added margin top for toggle
+                className="relative"
               >
                 {/* Stats Cards */}
                 <div className="flex gap-4 mb-6">
                   {examples[activeTab].stats.map((stat, i) => (
-                    <div key={i} className="flex items-center gap-2 px-4 py-2 bg-white dark:bg-gray-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700">
+                    <div
+                      key={i}
+                      className="flex items-center gap-2 px-4 py-2 bg-white dark:bg-gray-800 rounded-lg shadow-sm border border-gray-200 dark:border-gray-700"
+                    >
                       <CheckCircle2 className="w-4 h-4 text-green-500" />
-                      <span className="text-sm font-medium text-gray-900 dark:text-white">{stat}</span>
+                      <span className="text-sm font-medium text-gray-900 dark:text-white">
+                        {stat}
+                      </span>
                     </div>
                   ))}
                 </div>
 
-                {viewMode === 'marketplace' ? (
-                  // Marketplace Card View
-                  <div className="bg-white dark:bg-gray-800 rounded-xl border border-gray-200 dark:border-gray-700 shadow-xl overflow-hidden">
-                    <div className={`h-32 ${examples[activeTab].marketplaceImage} flex items-center justify-center`}>
-                      <CurrentIcon className="w-16 h-16 text-white opacity-90" />
-                    </div>
-                    <div className="p-6">
-                      <div className="flex justify-between items-start mb-4">
-                        <div>
-                          <h3 className="text-xl font-bold text-gray-900 dark:text-white">{examples[activeTab].title}</h3>
-                          <div className="text-sm text-gray-500">{t('marketing.plugins.marketplaceCard.author')}</div>
-                        </div>
-                        <button className="px-4 py-2 bg-brand-600 text-white rounded-lg font-medium text-sm hover:bg-brand-700 transition-colors">
-                          {t('marketing.plugins.marketplaceCard.installButton')}
-                        </button>
-                      </div>
-                      <p className="text-gray-600 dark:text-gray-300 mb-6">
-                        {examples[activeTab].description}
-                      </p>
-                      <div className="flex items-center gap-2 text-sm text-gray-500">
-                        <div className="flex -space-x-2">
-                          {[1, 2, 3].map(i => (
-                            <div key={i} className="w-6 h-6 rounded-full bg-gray-300 border-2 border-white dark:border-gray-800" />
-                          ))}
-                        </div>
-                        <span>{t('marketing.plugins.marketplaceCard.usedBy')}</span>
-                      </div>
-                    </div>
-                  </div>
-                ) : (
-                  // Code View
-                  <CodeBlock
-                    code={examples[activeTab].code}
-                    filename={`${examples[activeTab].id}_automation.py`}
-                  />
-                )}
+                {/* Workflow Visual */}
+                <WorkflowVisual
+                  variant={examples[activeTab].variant}
+                  trigger={examples[activeTab].title}
+                  actions={[]}
+                />
 
                 {/* CTA */}
                 <div className="mt-6 text-right">
-                  <a href="/features" className="inline-flex items-center gap-2 text-brand-600 dark:text-brand-400 font-medium hover:underline">
+                  <a
+                    href="/features"
+                    className="inline-flex items-center gap-2 text-brand-600 dark:text-brand-400 font-medium hover:underline"
+                  >
                     {t('marketing.plugins.cta')} <ArrowRight className="w-4 h-4" />
                   </a>
                 </div>
               </motion.div>
             </AnimatePresence>
           </div>
 
         </div>
       </div>
     </section>
|
|||||||
171
frontend/src/components/marketing/WorkflowVisual.tsx
Normal file
171
frontend/src/components/marketing/WorkflowVisual.tsx
Normal file
@@ -0,0 +1,171 @@
|
|||||||
|
import React from 'react';
|
||||||
|
import { motion } from 'framer-motion';
|
||||||
|
import { useTranslation } from 'react-i18next';
|
||||||
|
import {
|
||||||
|
Calendar,
|
||||||
|
Mail,
|
||||||
|
MessageSquare,
|
||||||
|
Clock,
|
||||||
|
Search,
|
||||||
|
FileText,
|
||||||
|
Sparkles,
|
||||||
|
ChevronRight,
|
||||||
|
} from 'lucide-react';
|
||||||
|
import type { LucideIcon } from 'lucide-react';
|
||||||
|
|
||||||
|
interface WorkflowBlock {
|
||||||
|
icon: LucideIcon;
|
||||||
|
label: string;
|
||||||
|
type: 'trigger' | 'action';
|
||||||
|
}
|
||||||
|
|
||||||
|
interface WorkflowVisualProps {
|
||||||
|
trigger: string;
|
||||||
|
actions: string[];
|
||||||
|
variant?: 'winback' | 'noshow' | 'report';
|
||||||
|
}
|
||||||
|
|
||||||
|
const getWorkflowConfig = (
|
||||||
|
variant: WorkflowVisualProps['variant']
|
||||||
|
): WorkflowBlock[] => {
|
||||||
|
switch (variant) {
|
||||||
|
case 'winback':
|
||||||
|
return [
|
||||||
|
{ icon: Clock, label: 'Schedule: Weekly', type: 'trigger' },
|
||||||
|
{ icon: Search, label: 'Find Inactive Customers', type: 'action' },
|
||||||
|
{ icon: Mail, label: 'Send Email', type: 'action' },
|
||||||
|
];
|
||||||
|
case 'noshow':
|
||||||
|
return [
|
||||||
|
{ icon: Calendar, label: 'Event Created', type: 'trigger' },
|
||||||
|
{ icon: Clock, label: 'Wait 2 Hours Before', type: 'action' },
|
||||||
|
{ icon: MessageSquare, label: 'Send SMS', type: 'action' },
|
||||||
|
];
|
||||||
|
case 'report':
|
||||||
|
return [
|
||||||
|
{ icon: Clock, label: 'Daily at 6 PM', type: 'trigger' },
|
||||||
|
{ icon: FileText, label: "Get Tomorrow's Schedule", type: 'action' },
|
||||||
|
{ icon: Mail, label: 'Send Summary', type: 'action' },
|
||||||
|
];
|
||||||
|
default:
|
||||||
|
return [
|
||||||
|
{ icon: Calendar, label: 'Event Created', type: 'trigger' },
|
||||||
|
{ icon: Clock, label: 'Wait', type: 'action' },
|
||||||
|
{ icon: Mail, label: 'Send Notification', type: 'action' },
|
||||||
|
];
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const WorkflowVisual: React.FC<WorkflowVisualProps> = ({
|
||||||
|
variant = 'noshow',
|
||||||
|
}) => {
|
||||||
|
const { t } = useTranslation();
|
||||||
|
const blocks = getWorkflowConfig(variant);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="bg-white dark:bg-gray-800 rounded-xl border border-gray-200 dark:border-gray-700 shadow-xl overflow-hidden">
|
||||||
|
{/* AI Copilot Input */}
|
||||||
|
<div className="p-4 bg-gradient-to-r from-purple-50 to-brand-50 dark:from-purple-900/20 dark:to-brand-900/20 border-b border-gray-200 dark:border-gray-700">
|
||||||
|
<div className="flex items-center gap-3 bg-white dark:bg-gray-900 rounded-lg border border-gray-200 dark:border-gray-600 px-4 py-3 shadow-sm">
|
||||||
|
<Sparkles className="w-5 h-5 text-purple-500" />
|
||||||
|
<span className="text-gray-400 dark:text-gray-500 text-sm flex-1">
|
||||||
|
{t('marketing.plugins.aiCopilot.placeholder')}
|
||||||
|
</span>
|
||||||
|
<motion.div
|
||||||
|
animate={{ opacity: [0.5, 1, 0.5] }}
|
||||||
|
transition={{ duration: 1.5, repeat: Infinity }}
|
||||||
|
className="w-2 h-5 bg-purple-500 rounded-sm"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<p className="text-xs text-gray-500 dark:text-gray-400 mt-2 ml-1">
|
||||||
|
{t('marketing.plugins.aiCopilot.examples')}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Workflow Visualization */}
|
||||||
|
<div className="p-6">
|
||||||
|
<div className="flex flex-col gap-3">
|
||||||
|
{blocks.map((block, index) => (
|
||||||
|
<React.Fragment key={index}>
|
||||||
|
{/* Block */}
|
||||||
|
<motion.div
|
||||||
|
initial={{ opacity: 0, x: -20 }}
|
||||||
|
animate={{ opacity: 1, x: 0 }}
|
||||||
|
transition={{ delay: index * 0.15 }}
|
||||||
|
className={`flex items-center gap-3 p-3 rounded-lg border ${
|
||||||
|
block.type === 'trigger'
|
||||||
|
? 'bg-gradient-to-r from-brand-50 to-purple-50 dark:from-brand-900/30 dark:to-purple-900/30 border-brand-200 dark:border-brand-800'
|
||||||
|
: 'bg-gray-50 dark:bg-gray-900/50 border-gray-200 dark:border-gray-700'
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
<div
|
||||||
|
className={`p-2 rounded-lg ${
|
||||||
|
block.type === 'trigger'
|
||||||
|
? 'bg-brand-100 dark:bg-brand-900/50 text-brand-600 dark:text-brand-400'
|
||||||
|
: 'bg-gray-100 dark:bg-gray-800 text-gray-600 dark:text-gray-400'
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
<block.icon className="w-5 h-5" />
|
||||||
|
</div>
|
||||||
|
<div className="flex-1">
|
||||||
|
<span
|
||||||
|
className={`text-xs font-medium uppercase tracking-wide ${
|
||||||
|
block.type === 'trigger'
|
||||||
|
? 'text-brand-600 dark:text-brand-400'
|
||||||
|
: 'text-gray-500 dark:text-gray-400'
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
{block.type === 'trigger' ? 'When' : 'Then'}
|
||||||
|
</span>
|
||||||
|
<p className="text-sm font-medium text-gray-900 dark:text-white">
|
||||||
|
{block.label}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<ChevronRight className="w-4 h-4 text-gray-400" />
|
||||||
|
</motion.div>
|
||||||
|
|
||||||
|
{/* Connector */}
|
||||||
|
{index < blocks.length - 1 && (
|
||||||
|
<div className="flex items-center justify-center h-4">
|
||||||
|
<div className="relative w-0.5 h-full bg-gray-200 dark:bg-gray-700">
|
||||||
|
<motion.div
|
||||||
|
className="absolute w-2 h-2 bg-brand-500 rounded-full left-1/2 -translate-x-1/2"
|
||||||
|
animate={{ y: [0, 12, 0] }}
|
||||||
|
transition={{
|
||||||
|
duration: 1,
|
||||||
|
repeat: Infinity,
|
||||||
|
delay: index * 0.3,
|
||||||
|
}}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</React.Fragment>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Integration badges */}
|
||||||
|
<div className="mt-6 pt-4 border-t border-gray-200 dark:border-gray-700">
|
||||||
|
<p className="text-xs text-gray-500 dark:text-gray-400 mb-2">
|
||||||
|
{t('marketing.plugins.integrations.description')}
|
||||||
|
</p>
|
||||||
|
<div className="flex gap-2 flex-wrap">
|
||||||
|
{['Gmail', 'Slack', 'Sheets', 'Twilio'].map((app) => (
|
||||||
|
<span
|
||||||
|
key={app}
|
||||||
|
className="px-2 py-1 text-xs bg-gray-100 dark:bg-gray-800 text-gray-600 dark:text-gray-400 rounded-md"
|
||||||
|
>
|
||||||
|
{app}
|
||||||
|
</span>
|
||||||
|
))}
|
||||||
|
<span className="px-2 py-1 text-xs text-gray-400 dark:text-gray-500">
|
||||||
|
+1000 more
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
export default WorkflowVisual;
|
||||||
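The component above chooses its three workflow blocks with a simple variant-to-config lookup. A framework-free sketch of that selection logic is below, runnable outside React; the icon components are replaced with string names for illustration (an assumption, since the real file uses `lucide-react` components), while the labels and the default-case fallback mirror the listing.

```typescript
// Sketch of the getWorkflowConfig lookup from WorkflowVisual.tsx,
// with icons as plain strings so it runs without React or lucide-react.
type BlockType = 'trigger' | 'action';
interface Block { icon: string; label: string; type: BlockType }

const configs: Record<string, Block[]> = {
  winback: [
    { icon: 'Clock', label: 'Schedule: Weekly', type: 'trigger' },
    { icon: 'Search', label: 'Find Inactive Customers', type: 'action' },
    { icon: 'Mail', label: 'Send Email', type: 'action' },
  ],
  noshow: [
    { icon: 'Calendar', label: 'Event Created', type: 'trigger' },
    { icon: 'Clock', label: 'Wait 2 Hours Before', type: 'action' },
    { icon: 'MessageSquare', label: 'Send SMS', type: 'action' },
  ],
  report: [
    { icon: 'Clock', label: 'Daily at 6 PM', type: 'trigger' },
    { icon: 'FileText', label: "Get Tomorrow's Schedule", type: 'action' },
    { icon: 'Mail', label: 'Send Summary', type: 'action' },
  ],
};

// Mirrors the component's default case for unknown variants.
const fallback: Block[] = [
  { icon: 'Calendar', label: 'Event Created', type: 'trigger' },
  { icon: 'Clock', label: 'Wait', type: 'action' },
  { icon: 'Mail', label: 'Send Notification', type: 'action' },
];

const getWorkflowConfig = (variant?: string): Block[] =>
  (variant && configs[variant]) || fallback;

console.log(getWorkflowConfig('report').map((b) => b.label).join(' -> '));
// -> Daily at 6 PM -> Get Tomorrow's Schedule -> Send Summary
```

Keeping the configs as data (rather than JSX) is what lets the component render any variant with one map over `blocks`.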
frontend/src/hooks/useActivepieces.ts — 35 lines (new file)
@@ -0,0 +1,35 @@
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import * as activepiecesApi from '../api/activepieces';

export const activepiecesKeys = {
  all: ['activepieces'] as const,
  defaultFlows: () => [...activepiecesKeys.all, 'defaultFlows'] as const,
};

export const useDefaultFlows = () => {
  return useQuery({
    queryKey: activepiecesKeys.defaultFlows(),
    queryFn: activepiecesApi.getDefaultFlows,
    staleTime: 30 * 1000,
  });
};

export const useRestoreFlow = () => {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: activepiecesApi.restoreFlow,
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: activepiecesKeys.defaultFlows() });
    },
  });
};

export const useRestoreAllFlows = () => {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: activepiecesApi.restoreAllFlows,
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: activepiecesKeys.defaultFlows() });
    },
  });
};
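The hooks above hinge on a small query-key factory: keys are readonly tuples built by extending a shared prefix, so invalidating the broader key also matches the narrower one under React Query's prefix matching. A minimal standalone sketch (the `isPrefix` helper is illustrative, not part of the codebase):

```typescript
// The query-key factory pattern from useActivepieces.ts, stripped of React Query.
const activepiecesKeys = {
  all: ['activepieces'] as const,
  defaultFlows: () => [...activepiecesKeys.all, 'defaultFlows'] as const,
};

// Illustrative stand-in for React Query's prefix matching: a filter key
// matches a query key when the filter is a leading prefix of the key.
const isPrefix = (filter: readonly unknown[], key: readonly unknown[]): boolean =>
  filter.every((part, i) => key[i] === part);

console.log(JSON.stringify(activepiecesKeys.defaultFlows()));
// -> ["activepieces","defaultFlows"]
console.log(isPrefix(activepiecesKeys.all, activepiecesKeys.defaultFlows()));
// -> true
```

This is why the mutations can invalidate just `activepiecesKeys.defaultFlows()` and stay in sync with `useDefaultFlows` without sharing any string literals.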
@@ -2444,14 +2444,14 @@
     "pageTitle": "Built for Developers, Designed for Business",
     "pageSubtitle": "SmoothSchedule isn't just cloud software. It's a programmable platform that adapts to your unique business logic.",
     "automationEngine": {
-      "badge": "Automation Engine",
-      "title": "Automated Task Manager",
-      "description": "Most schedulers only book appointments. SmoothSchedule runs your business. Our \"Automated Task Manager\" executes internal tasks without blocking your calendar.",
+      "badge": "AI-Powered Automation",
+      "title": "Visual Workflow Builder with AI Copilot",
+      "description": "Most schedulers only book appointments. SmoothSchedule runs your business. Create powerful automations with our visual builder or just describe what you want.",
       "features": {
-        "recurringJobs": "Run recurring jobs (e.g., \"Every Monday at 9am\")",
-        "customLogic": "Execute custom logic securely",
-        "fullContext": "Access full customer and event context",
-        "zeroInfrastructure": "Zero infrastructure management"
+        "visualBuilder": "Visual drag-and-drop workflow builder",
+        "aiCopilot": "AI Copilot creates flows from natural language",
+        "integrations": "Connect to 1000+ apps (Gmail, Slack, Sheets, etc.)",
+        "templates": "Pre-built templates for common automations"
       }
     },
     "multiTenancy": {
@@ -2558,7 +2558,7 @@
       "0": "Unlimited Users",
       "1": "Unlimited Appointments",
       "2": "Unlimited Automations",
-      "3": "Custom Python Scripts",
+      "3": "AI-Powered Workflow Builder",
       "4": "Custom Domain (White-Label)",
       "5": "Dedicated Support",
       "6": "API Access"
@@ -2638,9 +2638,9 @@
     },
     "faq": {
       "title": "Frequently Asked Questions",
-      "needPython": {
-        "question": "Do I need to know Python to use SmoothSchedule?",
-        "answer": "Not at all! You can use our pre-built plugins from the marketplace for common tasks like email reminders and reports. Python is only needed if you want to write custom scripts."
+      "needCoding": {
+        "question": "Do I need to know how to code to create automations?",
+        "answer": "Not at all! Our visual workflow builder lets you create automations by dragging and dropping blocks. Even better, just describe what you want in plain English and our AI Copilot will build the workflow for you."
       },
       "exceedLimits": {
         "question": "What happens if I exceed my plan's limits?",
@@ -2939,19 +2939,14 @@
       "copyright": "Smooth Schedule Inc. All rights reserved."
     },
     "plugins": {
-      "badge": "Limitless Automation",
-      "headline": "Choose from our Marketplace, or build your own.",
-      "subheadline": "Browse hundreds of pre-built automations to streamline your workflows instantly. Need something custom? Developers can write Python scripts to extend the platform endlessly.",
-      "viewToggle": {
-        "marketplace": "Marketplace",
-        "developer": "Developer"
+      "badge": "Visual Automation Builder",
+      "headline": "Build automations visually, or just describe what you want.",
+      "subheadline": "Create powerful workflows with our drag-and-drop builder. No coding required. Just describe what you want and our AI Copilot will build it for you.",
+      "aiCopilot": {
+        "placeholder": "Describe your automation...",
+        "examples": "e.g., \"Send a reminder 2 hours before each appointment\""
       },
-      "marketplaceCard": {
-        "author": "by SmoothSchedule Team",
-        "installButton": "Install Automation",
-        "usedBy": "Used by 1,200+ businesses"
-      },
-      "cta": "Explore the Marketplace",
+      "cta": "Try the Automation Builder",
       "examples": {
         "winback": {
           "title": "Client Win-Back",
@@ -2960,7 +2955,8 @@
           "retention": "+15% Retention",
           "revenue": "$4k/mo Revenue"
         },
-        "code": "# Win back lost customers\ndays_inactive = 60\ndiscount = \"20%\"\n\n# Find inactive customers\ninactive = api.get_customers(\n    last_visit_lt=days_ago(days_inactive)\n)\n\n# Send personalized offer\nfor customer in inactive:\n    api.send_email(\n        to=customer.email,\n        subject=\"We miss you!\",\n        body=f\"Come back for {discount} off!\"\n    )"
+        "trigger": "Schedule: Every Monday",
+        "actions": ["Find inactive customers", "Send personalized email"]
       },
       "noshow": {
         "title": "No-Show Prevention",
@@ -2969,7 +2965,8 @@
           "reduction": "-40% No-Shows",
           "utilization": "Better Utilization"
         },
-        "code": "# Prevent no-shows\nhours_before = 2\n\n# Find upcoming appointments\nupcoming = api.get_appointments(\n    start_time__within=hours(hours_before)\n)\n\n# Send SMS reminder\nfor appt in upcoming:\n    api.send_sms(\n        to=appt.customer.phone,\n        body=f\"Reminder: Appointment in 2h at {appt.time}\"\n    )"
+        "trigger": "Event: Appointment Created",
+        "actions": ["Wait 2 hours before", "Send SMS reminder"]
       },
       "report": {
         "title": "Daily Reports",
@@ -2978,8 +2975,13 @@
           "timeSaved": "Save 30min/day",
           "visibility": "Full Visibility"
         },
-        "code": "# Daily Manager Report\ntomorrow = date.today() + timedelta(days=1)\n\n# Get schedule stats\nstats = api.get_schedule_stats(date=tomorrow)\nrevenue = api.forecast_revenue(date=tomorrow)\n\n# Email manager\napi.send_email(\n    to=\"manager@business.com\",\n    subject=f\"Schedule for {tomorrow}\",\n    body=f\"Bookings: {stats.count}, Est. Rev: ${revenue}\"\n)"
+        "trigger": "Schedule: Daily at 6 PM",
+        "actions": ["Get tomorrow's schedule", "Send email summary"]
         }
+      },
+      "integrations": {
+        "title": "Connect to 1000+ Apps",
+        "description": "Gmail, Slack, Google Sheets, and more"
       }
     },
     "home": {
@@ -2993,8 +2995,8 @@
         "description": "Handle complex resources like staff, rooms, and equipment with concurrency limits."
       },
       "automationEngine": {
-        "title": "Automation Engine",
-        "description": "Install automations from our marketplace or build your own to automate tasks."
+        "title": "AI-Powered Automations",
+        "description": "Build visual workflows with AI assistance. Connect to 1000+ apps with no code."
       },
       "multiTenant": {
         "title": "Enterprise Security",
@@ -3023,7 +3025,7 @@
       },
       "testimonials": {
         "winBack": {
-          "quote": "I installed the 'Client Win-Back' plugin and recovered $2k in bookings the first week. No setup required.",
+          "quote": "I set up the 'Client Win-Back' automation in 2 minutes using the AI Copilot. Recovered $2k in bookings the first week.",
           "author": "Alex Rivera",
           "role": "Owner",
           "company": "TechSalon"
@@ -1,10 +1,13 @@
 import { useState, useEffect, useRef, useCallback } from 'react';
 import { useTranslation } from 'react-i18next';
 import { useQuery } from '@tanstack/react-query';
-import { Bot, RefreshCw, AlertTriangle, Loader2, ExternalLink, Sparkles } from 'lucide-react';
+import { Bot, RefreshCw, AlertTriangle, Loader2, ExternalLink, Sparkles, RotateCcw, ChevronDown } from 'lucide-react';
 import api from '../api/client';
 import { usePlanFeatures } from '../hooks/usePlanFeatures';
-import { LockedSection } from '../components/UpgradePrompt';
+import { UpgradePrompt } from '../components/UpgradePrompt';
+import { useDarkMode } from '../hooks/useDarkMode';
+import { useDefaultFlows, useRestoreFlow, useRestoreAllFlows } from '../hooks/useActivepieces';
+import ConfirmationModal from '../components/ConfirmationModal';
 
 interface ActivepiecesEmbedData {
   token: string;
@@ -40,11 +43,22 @@ const ActivepiecesVendorEventName = {
  */
 export default function Automations() {
   const { t, i18n } = useTranslation();
-  const { features, loading: featuresLoading } = usePlanFeatures();
+  const { permissions, isLoading: featuresLoading, canUse } = usePlanFeatures();
   const iframeRef = useRef<HTMLIFrameElement>(null);
   const [iframeReady, setIframeReady] = useState(false);
   const [authenticated, setAuthenticated] = useState(false);
   const initSentRef = useRef(false);
+  const [refreshKey, setRefreshKey] = useState(0);
+
+  // Dark mode support
+  const isDark = useDarkMode();
+
+  // Restore default flows
+  const { data: defaultFlows } = useDefaultFlows();
+  const restoreFlow = useRestoreFlow();
+  const restoreAll = useRestoreAllFlows();
+  const [showFlowsMenu, setShowFlowsMenu] = useState(false);
+  const [confirmRestore, setConfirmRestore] = useState<{ type: 'all' | string; name: string } | null>(null);
 
   // Fetch embed token for Activepieces
   const {
@@ -98,12 +112,13 @@ export default function Automations() {
         hidePageHeader: false,
         locale: i18n.language || 'en',
         initialRoute: '/flows', // Start on flows page to show sidebar
+        mode: isDark ? 'dark' : 'light',
       },
     };
 
     iframeRef.current.contentWindow.postMessage(initMessage, '*');
     initSentRef.current = true;
-  }, [embedData?.token, i18n.language]);
+  }, [embedData?.token, i18n.language, isDark]);
 
   // Listen for messages from Activepieces iframe
   useEffect(() => {
@@ -151,8 +166,27 @@ export default function Automations() {
     }
   }, [embedData?.token]);
 
+  // Reset state when theme changes to force iframe reload
+  useEffect(() => {
+    // Reset state so the iframe reinitializes with new theme
+    setIframeReady(false);
+    setAuthenticated(false);
+    initSentRef.current = false;
+  }, [isDark]);
+
+  // Close flows menu when clicking outside
+  useEffect(() => {
+    const handleClickOutside = (event: MouseEvent) => {
+      if (showFlowsMenu && !(event.target as Element).closest('.flows-menu-container')) {
+        setShowFlowsMenu(false);
+      }
+    };
+    document.addEventListener('mousedown', handleClickOutside);
+    return () => document.removeEventListener('mousedown', handleClickOutside);
+  }, [showFlowsMenu]);
+
   // Check feature access
-  const canAccessAutomations = features?.can_access_automations ?? true;
+  const canAccessAutomations = canUse('automations');
 
   // Loading state
   if (isLoading || featuresLoading) {
@@ -172,14 +206,7 @@ export default function Automations() {
   if (!canAccessAutomations) {
     return (
       <div className="p-6">
-        <LockedSection
-          title={t('automations.locked.title', 'Automations')}
-          description={t(
-            'automations.locked.description',
-            'Upgrade your plan to access powerful workflow automation with AI-powered flow creation.'
-          )}
-          featureName="automations"
-        />
+        <UpgradePrompt feature="automations" variant="banner" />
       </div>
     );
   }
@@ -212,8 +239,9 @@ export default function Automations() {
   }
 
   // Build iframe URL - use /embed route for SDK communication
+  // Include theme parameter for dark mode support
   const iframeSrc = embedData?.embedUrl
-    ? `${embedData.embedUrl}/embed`
+    ? `${embedData.embedUrl}/embed?theme=${isDark ? 'dark' : 'light'}`
     : '';
 
   // Show loading until authenticated
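The URL construction in this hunk is easy to verify in isolation. A hypothetical helper, `buildEmbedSrc`, extracting the same expression (it does not exist in the codebase; the example host URL is invented for illustration):

```typescript
// Hypothetical extraction of the iframeSrc logic: append the /embed route
// and a theme query parameter, or yield '' when no embed URL is available.
const buildEmbedSrc = (embedUrl: string | undefined, isDark: boolean): string =>
  embedUrl ? `${embedUrl}/embed?theme=${isDark ? 'dark' : 'light'}` : '';

console.log(buildEmbedSrc('https://flows.example.com', true));
// -> https://flows.example.com/embed?theme=dark
console.log(buildEmbedSrc(undefined, false));
// -> (empty string)
```

Passing the theme both in the URL and in the postMessage `mode` field covers the initial page paint as well as the SDK handshake.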
@@ -221,6 +249,47 @@
 
   return (
     <div className="h-full flex flex-col">
+      {/* Confirmation Modal */}
+      <ConfirmationModal
+        isOpen={!!confirmRestore}
+        onClose={() => setConfirmRestore(null)}
+        onConfirm={() => {
+          const refreshIframe = () => {
+            // Reset iframe state and increment key to force remount
+            initSentRef.current = false;
+            setAuthenticated(false);
+            setIframeReady(false);
+            setRefreshKey((k) => k + 1);
+            refetch();
+            setConfirmRestore(null);
+          };
+
+          if (confirmRestore?.type === 'all') {
+            restoreAll.mutate(undefined, {
+              onSuccess: refreshIframe,
+            });
+          } else if (confirmRestore) {
+            restoreFlow.mutate(confirmRestore.type, {
+              onSuccess: refreshIframe,
+            });
+          }
+        }}
+        title={t('automations.restore.title', 'Restore Default Flow')}
+        message={
+          confirmRestore?.type === 'all'
+            ? t(
+                'automations.restore.allMessage',
+                'This will restore all default automation flows. Any customizations will be overwritten.'
+              )
+            : t('automations.restore.singleMessage', 'Restore "{{name}}" to its default configuration?', {
+                name: confirmRestore?.name,
+              })
+        }
+        confirmText={t('automations.restore.confirm', 'Restore')}
+        variant="warning"
+        isLoading={restoreFlow.isPending || restoreAll.isPending}
+      />
+
       {/* Header */}
       <div className="bg-white dark:bg-gray-800 border-b border-gray-200 dark:border-gray-700 px-6 py-4">
         <div className="flex items-center justify-between">
@@ -228,17 +297,9 @@ export default function Automations() {
|
|||||||
<div className="p-2 bg-primary-100 dark:bg-primary-900/30 rounded-lg">
|
<div className="p-2 bg-primary-100 dark:bg-primary-900/30 rounded-lg">
|
||||||
<Bot className="h-6 w-6 text-primary-600 dark:text-primary-400" />
|
<Bot className="h-6 w-6 text-primary-600 dark:text-primary-400" />
|
||||||
</div>
|
</div>
|
||||||
<div>
|
|
||||||
<h1 className="text-xl font-semibold text-gray-900 dark:text-white">
|
<h1 className="text-xl font-semibold text-gray-900 dark:text-white">
|
||||||
{t('automations.title', 'Automations')}
|
{t('automations.title', 'Automations')}
|
||||||
</h1>
|
</h1>
|
||||||
<p className="text-sm text-gray-500 dark:text-gray-400">
|
|
||||||
{t(
|
|
||||||
'automations.subtitle',
|
|
||||||
'Build powerful workflows to automate your business'
|
|
||||||
)}
|
|
||||||
</p>
|
|
||||||
</div>
|
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div className="flex items-center space-x-3">
|
<div className="flex items-center space-x-3">
|
||||||
@@ -250,12 +311,65 @@ export default function Automations() {
|
|||||||
</span>
|
</span>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
{/* Restore Default Flows dropdown */}
|
||||||
|
<div className="relative flows-menu-container">
|
||||||
|
<button
|
||||||
|
onClick={() => setShowFlowsMenu(!showFlowsMenu)}
|
||||||
|
className="flex items-center gap-2 px-3 py-2 text-sm bg-gray-100 hover:bg-gray-200 dark:bg-gray-700 dark:hover:bg-gray-600 rounded-lg transition-colors"
|
||||||
|
>
|
||||||
|
<RotateCcw className="h-4 w-4" />
|
||||||
|
<span className="hidden sm:inline">{t('automations.restoreDefaults', 'Restore Defaults')}</span>
|
||||||
|
<ChevronDown className="h-4 w-4" />
|
||||||
|
</button>
|
||||||
|
|
||||||
|
{showFlowsMenu && (
|
||||||
|
<div className="absolute right-0 mt-2 w-72 bg-white dark:bg-gray-800 rounded-lg shadow-lg border border-gray-200 dark:border-gray-700 z-50">
|
||||||
|
<div className="p-2 border-b border-gray-200 dark:border-gray-700">
|
||||||
|
<button
|
||||||
|
onClick={() => {
|
||||||
|
setConfirmRestore({ type: 'all', name: 'All Flows' });
|
||||||
|
setShowFlowsMenu(false);
|
||||||
|
}}
|
||||||
|
className="w-full px-3 py-2 text-left text-sm font-medium text-primary-600 dark:text-primary-400 hover:bg-primary-50 dark:hover:bg-primary-900/20 rounded transition-colors"
|
||||||
|
>
|
||||||
|
{t('automations.restoreAll', 'Restore All Default Flows')}
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
<div className="max-h-64 overflow-y-auto p-2">
|
||||||
|
{defaultFlows?.map((flow) => (
|
||||||
|
<button
|
||||||
|
key={flow.flow_type}
|
||||||
|
onClick={() => {
|
||||||
|
setConfirmRestore({ type: flow.flow_type, name: flow.display_name });
|
||||||
|
setShowFlowsMenu(false);
|
||||||
|
}}
|
||||||
|
className="w-full px-3 py-2 text-left text-sm hover:bg-gray-100 dark:hover:bg-gray-700 rounded transition-colors flex justify-between items-center"
|
||||||
|
>
|
||||||
|
<span className="text-gray-700 dark:text-gray-300">{flow.display_name}</span>
|
||||||
|
{flow.is_modified && (
|
||||||
|
<span className="text-xs text-orange-500 dark:text-orange-400">
|
||||||
|
{t('automations.modified', 'Modified')}
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</button>
|
||||||
|
))}
|
||||||
|
{(!defaultFlows || defaultFlows.length === 0) && (
|
||||||
|
<p className="px-3 py-2 text-sm text-gray-500 dark:text-gray-400">
|
||||||
|
{t('automations.noDefaultFlows', 'No default flows available')}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
{/* Refresh button */}
|
{/* Refresh button */}
|
||||||
<button
|
<button
|
||||||
onClick={() => {
|
onClick={() => {
|
||||||
initSentRef.current = false;
|
initSentRef.current = false;
|
||||||
setAuthenticated(false);
|
setAuthenticated(false);
|
||||||
setIframeReady(false);
|
setIframeReady(false);
|
||||||
|
setRefreshKey((k) => k + 1);
|
||||||
refetch();
|
refetch();
|
||||||
}}
|
}}
|
||||||
className="p-2 text-gray-500 hover:text-gray-700 dark:text-gray-400 dark:hover:text-gray-200 hover:bg-gray-100 dark:hover:bg-gray-700 rounded-lg transition-colors"
|
className="p-2 text-gray-500 hover:text-gray-700 dark:text-gray-400 dark:hover:text-gray-200 hover:bg-gray-100 dark:hover:bg-gray-700 rounded-lg transition-colors"
|
||||||
@@ -296,6 +410,7 @@ export default function Automations() {
|
|||||||
|
|
||||||
{iframeSrc && (
|
{iframeSrc && (
|
||||||
<iframe
|
<iframe
|
||||||
|
key={`activepieces-${isDark ? 'dark' : 'light'}-${refreshKey}`}
|
||||||
ref={iframeRef}
|
ref={iframeRef}
|
||||||
src={iframeSrc}
|
src={iframeSrc}
|
||||||
className="w-full h-full border-0"
|
className="w-full h-full border-0"
|
||||||
|
|||||||
@@ -10,32 +10,15 @@ import {
  CheckCircle2,
  FileSignature,
  FileCheck,
-  Scale
+  Scale,
+  Sparkles
} from 'lucide-react';
-import CodeBlock from '../../components/marketing/CodeBlock';
+import WorkflowVisual from '../../components/marketing/WorkflowVisual';
import CTASection from '../../components/marketing/CTASection';

const FeaturesPage: React.FC = () => {
  const { t } = useTranslation();

-  const pluginExample = `# Custom Webhook Plugin
-import requests
-
-def execute(context):
-    event = context['event']
-
-    # Send data to external CRM
-    response = requests.post(
-        'https://api.crm.com/leads',
-        json={
-            'name': event.customer.name,
-            'email': event.customer.email,
-            'source': 'SmoothSchedule'
-        }
-    )
-
-    return response.status_code == 200`;
-
  return (
    <div className="bg-white dark:bg-gray-900 min-h-screen pt-24">

@@ -55,7 +38,7 @@ def execute(context):
          <div className="grid lg:grid-cols-2 gap-16 items-center">
            <div>
              <div className="inline-flex items-center gap-2 px-3 py-1 rounded-full bg-purple-100 dark:bg-purple-900/30 text-purple-600 dark:text-purple-400 text-sm font-medium mb-6">
-                <Zap className="w-4 h-4" />
+                <Sparkles className="w-4 h-4" />
                <span>{t('marketing.features.automationEngine.badge')}</span>
              </div>
              <h2 className="text-3xl font-bold text-gray-900 dark:text-white mb-6">
@@ -67,10 +50,10 @@ def execute(context):

              <ul className="space-y-4">
                {[
-                  t('marketing.features.automationEngine.features.recurringJobs'),
-                  t('marketing.features.automationEngine.features.customLogic'),
-                  t('marketing.features.automationEngine.features.fullContext'),
-                  t('marketing.features.automationEngine.features.zeroInfrastructure')
+                  t('marketing.features.automationEngine.features.visualBuilder'),
+                  t('marketing.features.automationEngine.features.aiCopilot'),
+                  t('marketing.features.automationEngine.features.integrations'),
+                  t('marketing.features.automationEngine.features.templates')
                ].map((item) => (
                  <li key={item} className="flex items-center gap-3">
                    <CheckCircle2 className="w-5 h-5 text-green-500" />
@@ -82,7 +65,7 @@ def execute(context):

            <div className="relative">
              <div className="absolute -inset-4 bg-purple-500/20 rounded-3xl blur-2xl" />
-              <CodeBlock code={pluginExample} filename="webhook_plugin.py" />
+              <WorkflowVisual variant="noshow" trigger="" actions={[]} />
            </div>
          </div>
        </div>
@@ -1,4 +1,8 @@
 {
-  "status": "passed",
-  "failedTests": []
+  "status": "failed",
+  "failedTests": [
+    "48355e96022c09342254-afc6f80c7b0d571cf29c",
+    "48355e96022c09342254-34a31faf9801d1748670",
+    "48355e96022c09342254-b1931f7c2caec15d8c31"
+  ]
 }
@@ -0,0 +1,71 @@
+# Page snapshot
+
+```yaml
+- generic [ref=e3]:
+  - generic [ref=e7]:
+    - link "Smooth Schedule" [ref=e9] [cursor=pointer]:
+      - /url: /
+      - img [ref=e10]
+      - generic [ref=e16]: Smooth Schedule
+    - generic [ref=e17]:
+      - heading "Orchestrate your business with precision." [level=1] [ref=e18]
+      - paragraph [ref=e19]: The all-in-one scheduling platform for businesses of all sizes. Manage resources, staff, and bookings effortlessly.
+    - generic [ref=e24]: © 2025 Smooth Schedule Inc.
+  - generic [ref=e26]:
+    - generic [ref=e27]:
+      - heading "Welcome back" [level=2] [ref=e28]
+      - paragraph [ref=e29]: Please enter your email and password to sign in.
+    - generic [ref=e31]:
+      - img [ref=e33]
+      - generic [ref=e35]:
+        - heading "Authentication Error" [level=3] [ref=e36]
+        - generic [ref=e37]: Invalid credentials
+    - generic [ref=e38]:
+      - generic [ref=e39]:
+        - generic [ref=e40]:
+          - generic [ref=e41]: Email
+          - generic [ref=e42]:
+            - generic:
+              - img
+            - textbox "Email" [ref=e43]:
+              - /placeholder: Enter your email
+              - text: owner@demo.com
+        - generic [ref=e44]:
+          - generic [ref=e45]: Password
+          - generic [ref=e46]:
+            - generic:
+              - img
+            - textbox "Password" [ref=e47]:
+              - /placeholder: ••••••••
+              - text: demopass123
+      - button "Sign in" [ref=e48]:
+        - generic [ref=e49]:
+          - text: Sign in
+          - img [ref=e50]
+      - generic [ref=e57]: Or continue with
+      - button "🇺🇸 English" [ref=e60]:
+        - img [ref=e61]
+        - generic [ref=e64]: 🇺🇸
+        - generic [ref=e65]: English
+        - img [ref=e66]
+    - generic [ref=e68]:
+      - heading "🔓 Quick Login (Dev Only)" [level=3] [ref=e70]:
+        - generic [ref=e71]: 🔓
+        - generic [ref=e72]: Quick Login (Dev Only)
+      - generic [ref=e73]:
+        - button "Business Owner TENANT_OWNER" [ref=e74]:
+          - generic [ref=e75]:
+            - generic [ref=e76]: Business Owner
+            - generic [ref=e77]: TENANT_OWNER
+        - button "Staff (Full Access) TENANT_STAFF" [ref=e78]:
+          - generic [ref=e79]:
+            - generic [ref=e80]: Staff (Full Access)
+            - generic [ref=e81]: TENANT_STAFF
+        - button "Staff (Limited) TENANT_STAFF" [ref=e82]:
+          - generic [ref=e83]:
+            - generic [ref=e84]: Staff (Limited)
+            - generic [ref=e85]: TENANT_STAFF
+      - generic [ref=e86]:
+        - text: "Password for all:"
+        - code [ref=e87]: test123
+```
Binary file not shown (new image, 446 KiB).
scripts/build-activepieces.sh (new executable file, 115 lines)
@@ -0,0 +1,115 @@
+#!/bin/bash
+# ==============================================================================
+# Build and Deploy Activepieces Docker Image
+#
+# This script builds the Activepieces image locally and optionally
+# transfers it to the production server.
+#
+# Usage:
+#   ./scripts/build-activepieces.sh                    # Build only
+#   ./scripts/build-activepieces.sh deploy             # Build and deploy to server
+#   ./scripts/build-activepieces.sh deploy user@server # Custom server
+# ==============================================================================
+
+set -e
+
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"
+AP_DIR="$PROJECT_ROOT/activepieces-fork"
+
+# Colors
+GREEN='\033[0;32m'
+YELLOW='\033[1;33m'
+RED='\033[0;31m'
+NC='\033[0m'
+
+print_status() { echo -e "${GREEN}>>> $1${NC}"; }
+print_warning() { echo -e "${YELLOW}>>> $1${NC}"; }
+print_error() { echo -e "${RED}>>> $1${NC}"; }
+
+# Parse arguments
+ACTION="${1:-build}"
+SERVER="${2:-poduck@smoothschedule.com}"
+
+IMAGE_NAME="smoothschedule_production_activepieces"
+TEMP_FILE="/tmp/activepieces-image.tar.gz"
+
+echo ""
+echo "==========================================="
+echo "  Activepieces Docker Image Builder"
+echo "==========================================="
+echo ""
+
+# Check we have the activepieces-fork directory
+if [ ! -d "$AP_DIR" ]; then
+    print_error "activepieces-fork directory not found at: $AP_DIR"
+    exit 1
+fi
+
+# ==============================================================================
+# Build the image
+# ==============================================================================
+print_status "Building Activepieces Docker image..."
+print_warning "This may take 5-10 minutes and requires 4GB+ RAM"
+
+cd "$AP_DIR"
+
+# Build with progress output
+docker build \
+    --progress=plain \
+    -t "$IMAGE_NAME" \
+    .
+
+print_status "Build complete!"
+
+# Show image size
+docker images "$IMAGE_NAME" --format "Image size: {{.Size}}"
+
+# ==============================================================================
+# Deploy to server (if requested)
+# ==============================================================================
+if [ "$ACTION" = "deploy" ]; then
+    echo ""
+    print_status "Preparing to deploy to: $SERVER"
+
+    # Save image to compressed archive
+    print_status "Saving image to $TEMP_FILE..."
+    docker save "$IMAGE_NAME" | gzip > "$TEMP_FILE"
+
+    # Show file size
+    ls -lh "$TEMP_FILE" | awk '{print "Archive size: " $5}'
+
+    # Transfer to server
+    print_status "Transferring to server (this may take a few minutes)..."
+    scp "$TEMP_FILE" "$SERVER:/tmp/activepieces-image.tar.gz"
+
+    # Load on server
+    print_status "Loading image on server..."
+    ssh "$SERVER" "gunzip -c /tmp/activepieces-image.tar.gz | docker load && rm /tmp/activepieces-image.tar.gz"
+
+    # Restart Activepieces on server
+    print_status "Restarting Activepieces on server..."
+    ssh "$SERVER" "cd ~/smoothschedule/smoothschedule && docker compose -f docker-compose.production.yml up -d activepieces"
+
+    # Clean up local temp file
+    rm -f "$TEMP_FILE"
+
+    print_status "Deployment complete!"
+    echo ""
+    echo "Activepieces should now be running with the new image."
+    echo "Check status with:"
+    echo "  ssh $SERVER 'cd ~/smoothschedule/smoothschedule && docker compose -f docker-compose.production.yml ps activepieces'"
+    echo ""
+else
+    echo ""
+    print_status "Image built successfully: $IMAGE_NAME"
+    echo ""
+    echo "To deploy to production, run:"
+    echo "  $0 deploy"
+    echo ""
+    echo "Or manually:"
+    echo "  docker save $IMAGE_NAME | gzip > /tmp/ap.tar.gz"
+    echo "  scp /tmp/ap.tar.gz $SERVER:/tmp/"
+    echo "  ssh $SERVER 'gunzip -c /tmp/ap.tar.gz | docker load'"
+    echo ""
+fi
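The deploy path above compresses the image with gzip, copies it over SSH, and decompresses it on the server. A minimal sketch of the same save/compress/restore round trip, using a plain file so it runs without Docker or a remote host (the `/tmp` paths are illustrative):

```shell
# Mirror of the script's transfer pipeline
# (docker save ... | gzip  ->  scp  ->  gunzip -c ... | docker load),
# demonstrated with a plain file instead of a Docker image.
printf 'fake image bytes\n' > /tmp/demo.img
gzip -c /tmp/demo.img > /tmp/demo.img.gz         # like: docker save "$IMAGE_NAME" | gzip > "$TEMP_FILE"
gunzip -c /tmp/demo.img.gz > /tmp/demo.restored  # like: gunzip -c ... | docker load
cmp -s /tmp/demo.img /tmp/demo.restored && echo "round trip OK"
```

Because gzip is lossless, the bytes loaded on the server are identical to what `docker save` produced locally.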
smoothschedule/.envs.example/.activepieces (new file, 71 lines)
@@ -0,0 +1,71 @@
+# Activepieces Environment Variables
+# Copy this file to .envs/.production/.activepieces and fill in values
+# ==============================================================================
+
+# External URL (for browser iframe embed)
+AP_FRONTEND_URL=https://automations.smoothschedule.com
+
+# Internal URL (for Django API calls within Docker network)
+AP_INTERNAL_URL=http://activepieces:80
+
+# Security Keys - MUST match between Activepieces and Django
+# Generate with: openssl rand -hex 32
+AP_JWT_SECRET=<generate-with-openssl-rand-hex-32>
+AP_ENCRYPTION_KEY=<generate-with-openssl-rand-hex-16>
+
+# Platform/Project IDs
+# ------------------------------------------------------------------------------
+# These are generated when you first create an admin user in Activepieces
+# Leave blank for initial deployment, then update after setup
+#
+# IMPORTANT: After initial setup:
+# 1. Visit https://automations.smoothschedule.com
+# 2. Create an admin account (this creates the platform)
+# 3. Get platform ID from the database:
+#    docker compose exec postgres psql -U <user> -d activepieces -c "SELECT id FROM platform LIMIT 1"
+# 4. Update AP_PLATFORM_ID here AND in .django file
+AP_DEFAULT_PROJECT_ID=
+AP_PLATFORM_ID=
+
+# Database (using same PostgreSQL as SmoothSchedule, but different database)
+# ------------------------------------------------------------------------------
+AP_POSTGRES_HOST=postgres
+AP_POSTGRES_PORT=5432
+AP_POSTGRES_DATABASE=activepieces
+
+# Generate strong random values for these (separate from main DB credentials)
+AP_POSTGRES_USERNAME=<random-username>
+AP_POSTGRES_PASSWORD=<random-password>
+
+# Redis
+# ------------------------------------------------------------------------------
+AP_REDIS_HOST=redis
+AP_REDIS_PORT=6379
+
+# AI Copilot (optional)
+# ------------------------------------------------------------------------------
+AP_OPENAI_API_KEY=<your-openai-api-key>
+
+# Execution Settings
+# ------------------------------------------------------------------------------
+AP_EXECUTION_MODE=UNSANDBOXED
+AP_TELEMETRY_ENABLED=false
+
+# Pieces Configuration
+# ------------------------------------------------------------------------------
+# CLOUD_AND_DB: Fetch pieces from cloud registry and local database
+AP_PIECES_SOURCE=CLOUD_AND_DB
+# OFFICIAL_AUTO: Automatically sync official pieces metadata from cloud
+AP_PIECES_SYNC_MODE=OFFICIAL_AUTO
+
+# Embedding (required for iframe integration)
+# ------------------------------------------------------------------------------
+AP_EMBEDDING_ENABLED=true
+
+# Templates (fetch official templates from Activepieces cloud)
+# ------------------------------------------------------------------------------
+AP_TEMPLATES_SOURCE_URL=https://cloud.activepieces.com/api/v1/templates
+
+# Custom Pieces Registry (Verdaccio - internal)
+# ------------------------------------------------------------------------------
+VERDACCIO_URL=http://verdaccio:4873
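The comments on `AP_JWT_SECRET` and `AP_ENCRYPTION_KEY` above call for `openssl rand`; a quick sketch of generating both values (variable names match the env file, byte lengths follow its comments):

```shell
# Generate the shared secrets referenced in .envs.example/.activepieces.
# -hex 32 emits 64 hex characters; -hex 16 emits 32 hex characters.
AP_JWT_SECRET=$(openssl rand -hex 32)
AP_ENCRYPTION_KEY=$(openssl rand -hex 16)

printf 'AP_JWT_SECRET=%s\n' "$AP_JWT_SECRET"
printf 'AP_ENCRYPTION_KEY=%s\n' "$AP_ENCRYPTION_KEY"
```

Remember to copy the same two values into the `.django` env file, since the files state they must match.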
smoothschedule/.envs.example/.django (new file, 75 lines)
@@ -0,0 +1,75 @@
+# Django Environment Variables
+# Copy this file to .envs/.production/.django and fill in values
+# ==============================================================================
+
+# General Settings
+# ------------------------------------------------------------------------------
+USE_DOCKER=yes
+DJANGO_SETTINGS_MODULE=config.settings.production
+DJANGO_SECRET_KEY=<generate-a-strong-secret-key>
+DJANGO_ALLOWED_HOSTS=.smoothschedule.com,localhost
+
+# IMPORTANT: Set to False in production
+DJANGO_DEBUG=False
+
+# Security
+# ------------------------------------------------------------------------------
+DJANGO_SECURE_SSL_REDIRECT=True
+DJANGO_SECURE_HSTS_INCLUDE_SUBDOMAINS=True
+DJANGO_SECURE_HSTS_PRELOAD=True
+DJANGO_SECURE_CONTENT_TYPE_NOSNIFF=True
+
+# CORS Configuration
+# Set specific origins in production, not all origins
+DJANGO_CORS_ALLOWED_ORIGINS=https://smoothschedule.com,https://platform.smoothschedule.com
+
+# Redis
+# ------------------------------------------------------------------------------
+REDIS_URL=redis://redis:6379/0
+
+# Celery
+# ------------------------------------------------------------------------------
+CELERY_FLOWER_USER=<random-username>
+CELERY_FLOWER_PASSWORD=<random-password>
+
+# Activepieces Integration
+# ------------------------------------------------------------------------------
+# URL for Activepieces to call SmoothSchedule API (Docker internal network)
+SMOOTHSCHEDULE_API_URL=http://django:8000
+
+# These MUST match the values in .activepieces file
+AP_FRONTEND_URL=https://automations.smoothschedule.com
+AP_INTERNAL_URL=http://activepieces:80
+AP_JWT_SECRET=<copy-from-activepieces-file>
+AP_ENCRYPTION_KEY=<copy-from-activepieces-file>
+AP_DEFAULT_PROJECT_ID=<copy-from-activepieces-file>
+AP_PLATFORM_ID=<copy-from-activepieces-file>
+
+# Twilio (for SMS 2FA and phone numbers)
+# ------------------------------------------------------------------------------
+TWILIO_ACCOUNT_SID=<your-twilio-account-sid>
+TWILIO_AUTH_TOKEN=<your-twilio-auth-token>
+TWILIO_PHONE_NUMBER=<your-twilio-phone-number>
+
+# Stripe (for payments)
+# ------------------------------------------------------------------------------
+# Use live keys in production (pk_live_*, sk_live_*)
+STRIPE_PUBLISHABLE_KEY=pk_live_<your-stripe-publishable-key>
+STRIPE_SECRET_KEY=sk_live_<your-stripe-secret-key>
+STRIPE_WEBHOOK_SECRET=whsec_<your-stripe-webhook-secret>
+
+# Mail Server Configuration
+# ------------------------------------------------------------------------------
+MAIL_SERVER_SSH_HOST=mail.talova.net
+MAIL_SERVER_SSH_USER=poduck
+MAIL_SERVER_DOCKER_CONTAINER=mailserver
+MAIL_SERVER_SSH_KEY_PATH=/app/.ssh/id_ed25519
+MAIL_SERVER_SSH_KNOWN_HOSTS_PATH=/app/.ssh/known_hosts
+
+# AWS S3 / DigitalOcean Spaces (for media storage)
+# ------------------------------------------------------------------------------
+AWS_ACCESS_KEY_ID=<your-spaces-access-key>
+AWS_SECRET_ACCESS_KEY=<your-spaces-secret-key>
+AWS_STORAGE_BUCKET_NAME=smoothschedule
+AWS_S3_REGION_NAME=nyc3
+AWS_S3_ENDPOINT_URL=https://nyc3.digitaloceanspaces.com
smoothschedule/.envs.example/.postgres (new file, 15 lines)
@@ -0,0 +1,15 @@
+# PostgreSQL Environment Variables
+# Copy this file to .envs/.production/.postgres and fill in values
+# ==============================================================================
+
+POSTGRES_HOST=postgres
+POSTGRES_PORT=5432
+POSTGRES_DB=smoothschedule
+
+# Generate strong random values for these (32+ characters recommended)
+POSTGRES_USER=<random-username>
+POSTGRES_PASSWORD=<random-password>
+
+# Construct DATABASE_URL from the above values
+# Format: postgres://USER:PASSWORD@HOST:PORT/DATABASE
+DATABASE_URL=postgres://<username>:<password>@postgres:5432/smoothschedule
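The `.postgres` file asks you to mirror the user and password into `DATABASE_URL`; a small sketch of assembling it from the other variables (credentials below are placeholders for illustration):

```shell
# Build DATABASE_URL out of the individual POSTGRES_* values.
# The user and password here are placeholders, not real values.
POSTGRES_USER=exampleuser
POSTGRES_PASSWORD=examplepass
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DB=smoothschedule
DATABASE_URL="postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
echo "$DATABASE_URL"
# → postgres://exampleuser:examplepass@postgres:5432/smoothschedule
```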
@@ -31,9 +31,21 @@ AP_OPENAI_API_KEY=sk-proj-2y1XBAloHQ4cIVt2DIqN7sN11WB3TMM0GOMTy-1svntvCMz2dOGJAa
 AP_EXECUTION_MODE=UNSANDBOXED
 AP_TELEMETRY_ENABLED=false
 
+# Pieces Configuration
+# CLOUD_AND_DB: Fetch pieces from cloud registry and local database
+AP_PIECES_SOURCE=CLOUD_AND_DB
+# OFFICIAL_AUTO: Automatically sync official pieces metadata from cloud
+AP_PIECES_SYNC_MODE=OFFICIAL_AUTO
+
 # Embedding
 AP_EMBEDDING_ENABLED=true
 
 # Development Pieces (comma-separated list of piece names to load from dist/packages/pieces)
 # The build watcher may fail without nx, but the piece should still load from pre-built dist
-AP_DEV_PIECES=smoothschedule
+AP_DEV_PIECES=smoothschedule,python-code,ruby-code
+
+# Verdaccio - set to 'none' to skip Verdaccio (pieces are pre-built in image)
+VERDACCIO_URL=none
+
+# Templates Source URL - fetch official Activepieces templates from cloud
+AP_TEMPLATES_SOURCE_URL=https://cloud.activepieces.com/api/v1/templates
smoothschedule/.gitignore (vendored, 1 line changed)
@@ -276,6 +276,7 @@ smoothschedule/media/
 .env
 .envs/*
 !.envs/.local/
+!.envs.example/
 
 # SSH keys for mail server access
 .ssh/
@@ -110,6 +110,15 @@ http:
      tls:
        certResolver: letsencrypt

+    # Automations subdomain (Activepieces)
+    automations-router:
+      rule: 'Host(`automations.smoothschedule.com`)'
+      entryPoints:
+        - web-secure
+      service: activepieces
+      tls:
+        certResolver: letsencrypt
+
    # Wildcard subdomain router for tenant subdomains
    # Uses DNS challenge for wildcard certificate (*.smoothschedule.com)
    # Routes to nginx which serves the frontend SPA and proxies /api/ to Django
@@ -166,6 +175,11 @@ http:
        servers:
          - url: http://nginx:80

+    activepieces:
+      loadBalancer:
+        servers:
+          - url: http://activepieces:80
+
providers:
  # https://doc.traefik.io/traefik/master/providers/file/
  file:
smoothschedule/compose/production/verdaccio/config.yaml (new file, 76 lines)
@@ -0,0 +1,76 @@
+# Verdaccio configuration for SmoothSchedule custom Activepieces pieces
+
+storage: /verdaccio/storage
+plugins: /verdaccio/plugins
+
+web:
+  enable: true
+  title: SmoothSchedule NPM Registry
+
+# Authentication - allow anyone to read, require auth to publish
+auth:
+  htpasswd:
+    file: /verdaccio/storage/htpasswd
+    # Allow up to 100 users to register
+    max_users: 100
+
+# Uplink to official npm registry for packages we don't have
+uplinks:
+  npmjs:
+    url: https://registry.npmjs.org/
+    cache: true
+
+# Package access rules
+packages:
+  # Our custom Activepieces pieces - serve locally, allow any publish
+  '@activepieces/piece-smoothschedule':
+    access: $all
+    publish: $all
+    unpublish: $all
+    # No proxy - only serve from local storage
+
+  '@activepieces/piece-python-code':
+    access: $all
+    publish: $all
+    unpublish: $all
+
+  '@activepieces/piece-ruby-code':
+    access: $all
+    publish: $all
+    unpublish: $all
+
+  '@activepieces/piece-interfaces':
+    access: $all
+    publish: $all
+    unpublish: $all
+
+  # All other @activepieces packages - proxy to npm
+  '@activepieces/*':
+    access: $all
+    publish: $authenticated
+    proxy: npmjs
+
+  # All other scoped packages - proxy to npm
+  '@*/*':
+    access: $all
+    publish: $authenticated
+    proxy: npmjs
+
+  # All unscoped packages - proxy to npm
+  '**':
+    access: $all
+    publish: $authenticated
+    proxy: npmjs
+
+# Server settings
+server:
+  keepAliveTimeout: 60
+
+# Middleware
+middlewares:
+  audit:
+    enabled: true
+
+# Logging
+logs:
+  - { type: stdout, format: pretty, level: warn }
@@ -109,12 +109,14 @@ if AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY and AWS_STORAGE_BUCKET_NAME:
        "OPTIONS": {
            "location": "media",
            "file_overwrite": False,
+            "default_acl": "public-read",
        },
    },
    "staticfiles": {
        "BACKEND": "storages.backends.s3.S3Storage",
        "OPTIONS": {
            "location": "static",
+            "default_acl": "public-read",
        },
    },
}
@@ -1,6 +1,7 @@
 from django.conf import settings
 from django.conf.urls.static import static
 from django.contrib import admin
+from django.http import FileResponse, Http404
 from django.urls import include
 from django.urls import path
 from django.views import defaults as default_views
@@ -9,6 +10,28 @@ from django.views.generic import TemplateView
 from drf_spectacular.views import SpectacularAPIView
 from drf_spectacular.views import SpectacularSwaggerView
 from rest_framework.authtoken.views import obtain_auth_token
+import os
+
+
+def serve_static_image(request, filename):
+    """Serve static images with CORS headers for Activepieces integration."""
+    allowed_files = {
+        'logo-branding.png': 'image/png',
+        'python-logo.svg': 'image/svg+xml',
+        'ruby-logo.svg': 'image/svg+xml',
+    }
+    if filename not in allowed_files:
+        raise Http404("Image not found")
+
+    # Try to find the file in static directories
+    for static_dir in settings.STATICFILES_DIRS:
+        file_path = os.path.join(static_dir, 'images', filename)
+        if os.path.exists(file_path):
+            response = FileResponse(open(file_path, 'rb'), content_type=allowed_files[filename])
+            response['Access-Control-Allow-Origin'] = '*'
+            response['Cache-Control'] = 'public, max-age=86400'
+            return response
+    raise Http404("Image not found")
+
+
 from smoothschedule.identity.users.api_views import (
     login_view, current_user_view, logout_view, send_verification_email, verify_email,
@@ -45,6 +68,8 @@ from smoothschedule.identity.core.api_views import (
 )
 
 urlpatterns = [
+    # Static images with CORS (for Activepieces integration)
+    path("images/<str:filename>", serve_static_image, name="serve_static_image"),
     # Django Admin, use {% url 'admin:index' %}
     path(settings.ADMIN_URL, admin.site.urls),
     # User management
@@ -64,6 +89,8 @@ urlpatterns = [
     # ...
     # Media files
     *static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT),
+    # Static files (for development)
+    *static(settings.STATIC_URL, document_root=settings.STATICFILES_DIRS[0] if settings.STATICFILES_DIRS else None),
 ]
 
 # API URLS
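The `serve_static_image` view above rejects anything outside a fixed whitelist rather than trying to sanitize the filename, so traversal attempts never reach the filesystem. A minimal standalone sketch of that pattern (the `resolve_image` helper here is hypothetical, not code from the repo):

```python
import os

# Fixed whitelist mapping served filenames to content types,
# mirroring the allowed_files dict in the view.
ALLOWED_FILES = {
    'logo-branding.png': 'image/png',
    'python-logo.svg': 'image/svg+xml',
}

def resolve_image(static_dir, filename):
    """Return (path, content_type) for a whitelisted name, else None."""
    content_type = ALLOWED_FILES.get(filename)
    if content_type is None:
        # Covers '../../etc/passwd' and every other off-list name.
        return None
    return os.path.join(static_dir, 'images', filename), content_type

assert resolve_image('/static', '../secret.key') is None
assert resolve_image('/static', 'logo-branding.png') == (
    '/static/images/logo-branding.png', 'image/png')
```

A lookup table also lets the view serve the correct `Content-Type` without guessing from the extension.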
@@ -4,6 +4,36 @@ volumes:
   smoothschedule_local_redis_data: {}
   smoothschedule_local_activepieces_cache: {}
 
+# Memory limits for local development
+# Prevents containers from consuming all system RAM and freezing
+# Adjust if you have more/less RAM available
+x-memory-limits:
+  small: &mem-small
+    deploy:
+      resources:
+        limits:
+          memory: 128M
+  medium: &mem-medium
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+  large: &mem-large
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+  xlarge: &mem-xlarge
+    deploy:
+      resources:
+        limits:
+          memory: 768M
+  database: &mem-database
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+
 services:
   django: &django
     build:
@@ -11,6 +41,7 @@ services:
       dockerfile: ./compose/local/django/Dockerfile
     image: smoothschedule_local_django
     container_name: smoothschedule_local_django
+    <<: *mem-large
    depends_on:
      - postgres
      - redis
@@ -2,8 +2,48 @@ volumes:
   production_postgres_data: {}
   production_postgres_data_backups: {}
   production_traefik: {}
 
   production_redis_data: {}
+  production_activepieces_cache: {}
+  production_verdaccio_storage: {}
+
+# Memory limits for 2GB RAM server
+# Total allocated: ~1.6GB, leaving ~400MB for system/OS
+x-memory-limits:
+  small: &mem-small
+    deploy:
+      resources:
+        limits:
+          memory: 64M
+        reservations:
+          memory: 32M
+  medium: &mem-medium
+    deploy:
+      resources:
+        limits:
+          memory: 128M
+        reservations:
+          memory: 64M
+  large: &mem-large
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+        reservations:
+          memory: 128M
+  xlarge: &mem-xlarge
+    deploy:
+      resources:
+        limits:
+          memory: 384M
+        reservations:
+          memory: 192M
+  database: &mem-database
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+        reservations:
+          memory: 256M
+
 services:
   django: &django
@@ -12,6 +52,7 @@ services:
       dockerfile: ./compose/production/django/Dockerfile
 
     image: smoothschedule_production_django
+    <<: *mem-large
     depends_on:
       - postgres
       - redis
@@ -86,3 +127,27 @@ services:
       - ./.envs/.production/.django
     volumes:
       - production_postgres_data_backups:/backups:z
+
+  verdaccio:
+    image: verdaccio/verdaccio:5
+    restart: unless-stopped
+    volumes:
+      - production_verdaccio_storage:/verdaccio/storage
+      - ./compose/production/verdaccio/config.yaml:/verdaccio/conf/config.yaml:ro
+    environment:
+      - VERDACCIO_PORT=4873
+
+  activepieces:
+    build:
+      context: ../activepieces-fork
+      dockerfile: Dockerfile
+    image: smoothschedule_production_activepieces
+    restart: unless-stopped
+    depends_on:
+      - postgres
+      - redis
+      - verdaccio
+    env_file:
+      - ./.envs/.production/.activepieces
+    volumes:
+      - production_activepieces_cache:/root/.activepieces
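The `<<: *mem-large` lines rely on YAML merge keys: the anchored mapping's keys are merged into the service, and any key the service defines itself takes precedence over the anchor's. A small pure-Python sketch of that (shallow) merge semantics, with values taken from the production compose file:

```python
# The &mem-large anchor from the compose file, as a plain dict.
mem_large = {
    'deploy': {
        'resources': {
            'limits': {'memory': '256M'},
            'reservations': {'memory': '128M'},
        }
    }
}

def apply_merge(service, anchor):
    """Shallow merge, as YAML '<<' does: anchor keys first, service keys win."""
    merged = dict(anchor)
    merged.update(service)
    return merged

django = apply_merge({'image': 'smoothschedule_production_django'}, mem_large)
assert django['deploy']['resources']['limits']['memory'] == '256M'
assert django['image'] == 'smoothschedule_production_django'
```

Note the merge is shallow: if a service declared its own `deploy:` key, the anchor's entire `deploy` subtree would be replaced, not combined.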
smoothschedule/scripts/init-production.sh — 278 lines (new executable file)
@@ -0,0 +1,278 @@
#!/bin/bash
# ==============================================================================
# SmoothSchedule Production Initialization Script
#
# Run this ONCE on a fresh production server to set up everything.
# Subsequent deployments should use deploy.sh
#
# Usage: ./scripts/init-production.sh
# ==============================================================================

set -e

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

print_status() { echo -e "${GREEN}>>> $1${NC}"; }
print_warning() { echo -e "${YELLOW}>>> $1${NC}"; }
print_error() { echo -e "${RED}>>> $1${NC}"; }
print_info() { echo -e "${BLUE}>>> $1${NC}"; }

echo ""
echo "==========================================="
echo " SmoothSchedule Production Initialization"
echo "==========================================="
echo ""

# Check we're in the right directory
if [ ! -f "docker-compose.production.yml" ]; then
    print_error "Must run from smoothschedule/smoothschedule directory"
    exit 1
fi

# ==============================================================================
# Step 1: Check/Create Environment Files
# ==============================================================================
print_status "Step 1: Checking environment files..."

if [ ! -d ".envs/.production" ]; then
    print_warning "Production environment not found. Creating from templates..."
    mkdir -p .envs/.production

    if [ -d ".envs.example" ]; then
        cp .envs.example/.django .envs/.production/.django
        cp .envs.example/.postgres .envs/.production/.postgres
        cp .envs.example/.activepieces .envs/.production/.activepieces
        print_info "Copied template files to .envs/.production/"
        print_warning "IMPORTANT: Edit these files with your actual values before continuing!"
        print_warning "Required files:"
        echo "  - .envs/.production/.django"
        echo "  - .envs/.production/.postgres"
        echo "  - .envs/.production/.activepieces"
        echo ""
        read -p "Press Enter after you've edited the files, or Ctrl+C to abort..."
    else
        print_error "Template files not found in .envs.example/"
        print_error "Please create .envs/.production/ manually with .django, .postgres, and .activepieces"
        exit 1
    fi
fi

print_status "Environment files exist."

# ==============================================================================
# Step 2: Generate Security Keys (if not set)
# ==============================================================================
print_status "Step 2: Checking security keys..."

DJANGO_FILE=".envs/.production/.django"
AP_FILE=".envs/.production/.activepieces"

# Check if DJANGO_SECRET_KEY is a placeholder
if grep -q "<generate" "$DJANGO_FILE" 2>/dev/null; then
    print_warning "Generating Django secret key..."
    NEW_SECRET=$(openssl rand -hex 32)
    sed -i "s/<generate-a-strong-secret-key>/$NEW_SECRET/" "$DJANGO_FILE"
    print_info "Django secret key generated."
fi

# Check if AP_JWT_SECRET is a placeholder
if grep -q "<generate" "$AP_FILE" 2>/dev/null; then
    print_warning "Generating Activepieces JWT secret..."
    NEW_JWT=$(openssl rand -hex 32)
    sed -i "s/<generate-with-openssl-rand-hex-32>/$NEW_JWT/" "$AP_FILE"
    # Also update in Django file
    sed -i "s/<copy-from-activepieces-file>/$NEW_JWT/" "$DJANGO_FILE" 2>/dev/null || true
    print_info "JWT secret generated."

    print_warning "Generating Activepieces encryption key..."
    NEW_ENC=$(openssl rand -hex 16)
    sed -i "s/<generate-with-openssl-rand-hex-16>/$NEW_ENC/" "$AP_FILE"
    print_info "Encryption key generated."
fi

# ==============================================================================
# Step 3: Pull/Build Docker Images
# ==============================================================================
print_status "Step 3: Building Docker images..."

# Check if Activepieces image exists or needs to be pulled/loaded
if ! docker images | grep -q "smoothschedule_production_activepieces"; then
    print_warning "Activepieces image not found locally."
    print_info "The production server typically cannot build this image (requires 4GB+ RAM)."
    print_info "Options:"
    echo "  1. Build on a dev machine and transfer:"
    echo "     cd activepieces-fork"
    echo "     docker build -t smoothschedule_production_activepieces ."
    echo "     docker save smoothschedule_production_activepieces | gzip > /tmp/ap.tar.gz"
    echo "     scp /tmp/ap.tar.gz server:/tmp/"
    echo "     ssh server 'gunzip -c /tmp/ap.tar.gz | docker load'"
    echo ""
    read -p "Press Enter after you've loaded the Activepieces image, or type 'skip' to continue anyway: " SKIP_AP

    if [ "$SKIP_AP" != "skip" ]; then
        if ! docker images | grep -q "smoothschedule_production_activepieces"; then
            print_error "Activepieces image still not found. Please load it first."
            exit 1
        fi
    fi
fi

# Build other images
print_info "Building Django and other images..."
docker compose -f docker-compose.production.yml build django nginx

# ==============================================================================
# Step 4: Start Core Services
# ==============================================================================
print_status "Step 4: Starting core services..."

docker compose -f docker-compose.production.yml up -d postgres redis
print_info "Waiting for PostgreSQL to be ready..."
sleep 10

# ==============================================================================
# Step 5: Create Databases
# ==============================================================================
print_status "Step 5: Setting up databases..."

# Get credentials from env files
POSTGRES_USER=$(grep -E "^POSTGRES_USER=" .envs/.production/.postgres | cut -d= -f2)
POSTGRES_PASSWORD=$(grep -E "^POSTGRES_PASSWORD=" .envs/.production/.postgres | cut -d= -f2)
AP_DB_USER=$(grep -E "^AP_POSTGRES_USERNAME=" .envs/.production/.activepieces | cut -d= -f2)
AP_DB_PASS=$(grep -E "^AP_POSTGRES_PASSWORD=" .envs/.production/.activepieces | cut -d= -f2)
AP_DB_NAME=$(grep -E "^AP_POSTGRES_DATABASE=" .envs/.production/.activepieces | cut -d= -f2)

# Wait for PostgreSQL to accept connections
for i in {1..30}; do
    if docker compose -f docker-compose.production.yml exec -T postgres pg_isready -U "$POSTGRES_USER" > /dev/null 2>&1; then
        print_info "PostgreSQL is ready."
        break
    fi
    echo "  Waiting for PostgreSQL... ($i/30)"
    sleep 2
done

# Create Activepieces database and user
print_info "Creating Activepieces database..."
docker compose -f docker-compose.production.yml exec -T postgres psql -U "$POSTGRES_USER" -d postgres << EOSQL
-- Create user if not exists
DO \$\$
BEGIN
    IF NOT EXISTS (SELECT FROM pg_roles WHERE rolname = '$AP_DB_USER') THEN
        CREATE USER "$AP_DB_USER" WITH PASSWORD '$AP_DB_PASS';
    END IF;
END
\$\$;

-- Create database if not exists
SELECT 'CREATE DATABASE $AP_DB_NAME OWNER "$AP_DB_USER"'
WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = '$AP_DB_NAME')\gexec
EOSQL

print_info "Databases configured."

# ==============================================================================
# Step 6: Start All Services
# ==============================================================================
print_status "Step 6: Starting all services..."

docker compose -f docker-compose.production.yml up -d

print_info "Waiting for services to start..."
sleep 15

# ==============================================================================
# Step 7: Run Django Migrations
# ==============================================================================
print_status "Step 7: Running Django migrations..."

docker compose -f docker-compose.production.yml exec -T django python manage.py migrate
docker compose -f docker-compose.production.yml exec -T django python manage.py collectstatic --noinput

# ==============================================================================
# Step 8: Create Django Superuser
# ==============================================================================
print_status "Step 8: Django superuser..."

echo "Would you like to create a Django superuser? (y/n)"
read -r CREATE_SUPER
if [ "$CREATE_SUPER" = "y" ]; then
    docker compose -f docker-compose.production.yml exec django python manage.py createsuperuser
fi

# ==============================================================================
# Step 9: Initialize Activepieces Platform
# ==============================================================================
print_status "Step 9: Initializing Activepieces platform..."

# Check if Activepieces is healthy
AP_HEALTHY=false
for i in {1..30}; do
    if curl -sf http://localhost:80/api/v1/health > /dev/null 2>&1; then
        AP_HEALTHY=true
        print_info "Activepieces is healthy."
        break
    fi
    echo "  Waiting for Activepieces... ($i/30)"
    sleep 2
done

if [ "$AP_HEALTHY" = "false" ]; then
    print_warning "Activepieces health check timed out. Check logs with:"
    echo "  docker compose -f docker-compose.production.yml logs activepieces"
fi

# Check if platform ID is configured
AP_PLATFORM_ID=$(grep -E "^AP_PLATFORM_ID=" .envs/.production/.activepieces | cut -d= -f2)

if [ -z "$AP_PLATFORM_ID" ]; then
    print_warning "Activepieces platform not yet initialized."
    echo ""
    print_info "To complete Activepieces setup:"
    echo "  1. Visit https://automations.smoothschedule.com"
    echo "  2. Sign up to create the first admin user (this creates the platform)"
    echo "  3. Get the platform ID:"
    echo "     docker compose -f docker-compose.production.yml exec postgres psql -U $AP_DB_USER -d $AP_DB_NAME -c 'SELECT id FROM platform'"
    echo "  4. Update AP_PLATFORM_ID in both:"
    echo "     - .envs/.production/.activepieces"
    echo "     - .envs/.production/.django"
    echo "  5. Restart services:"
    echo "     docker compose -f docker-compose.production.yml restart"
    echo ""
else
    print_info "Activepieces platform ID configured: $AP_PLATFORM_ID"
fi

# ==============================================================================
# Step 10: Show Status
# ==============================================================================
print_status "Step 10: Checking service status..."

echo ""
docker compose -f docker-compose.production.yml ps

echo ""
echo "==========================================="
print_status "Initialization Complete!"
echo "==========================================="
echo ""
echo "Your application should be available at:"
echo "  - https://smoothschedule.com"
echo "  - https://platform.smoothschedule.com"
echo "  - https://automations.smoothschedule.com"
echo ""
echo "Next steps:"
echo "  1. Complete Activepieces setup (if platform ID not set)"
echo "  2. Create your first tenant via Django admin"
echo "  3. Run: python manage.py provision_ap_connections"
echo ""
echo "Useful commands:"
echo "  View logs:    docker compose -f docker-compose.production.yml logs -f"
echo "  Restart:      docker compose -f docker-compose.production.yml restart"
echo "  Django shell: docker compose -f docker-compose.production.yml exec django python manage.py shell"
echo ""
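The script generates its secrets with `openssl rand -hex 32` and `-hex 16`. The same values can be produced from Python's standard library with `secrets.token_hex`, shown here only to illustrate the expected key lengths (`token_hex(n)` yields `2*n` hex characters from a CSPRNG):

```python
import secrets

# Stand-ins for the script's openssl calls:
jwt_secret = secrets.token_hex(32)      # like `openssl rand -hex 32`
encryption_key = secrets.token_hex(16)  # like `openssl rand -hex 16`

# 64 and 32 hex characters respectively, lowercase [0-9a-f].
assert len(jwt_secret) == 64
assert len(encryption_key) == 32
assert all(c in '0123456789abcdef' for c in jwt_secret)
```

This is handy when debugging the placeholder substitution locally on a machine without `openssl` installed.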
@@ -0,0 +1,981 @@
|
|||||||
|
"""
|
||||||
|
Comprehensive unit tests for Payments Views - Additional Coverage.
|
||||||
|
|
||||||
|
These tests cover the remaining uncovered lines in views.py to reach 100% coverage.
|
||||||
|
All tests use mocks to avoid database access for fast execution.
|
||||||
|
"""
|
||||||
|
from unittest.mock import Mock, patch, MagicMock, PropertyMock, call
|
||||||
|
from rest_framework.test import APIRequestFactory, force_authenticate
|
||||||
|
from rest_framework import status
|
||||||
|
import pytest
|
||||||
|
from decimal import Decimal
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
import stripe
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# ApiKeysView POST Tests (Missing Coverage)
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
class TestApiKeysViewPost:
|
||||||
|
"""Test ApiKeysView POST method comprehensively."""
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.validate_stripe_keys')
|
||||||
|
@patch('smoothschedule.commerce.payments.views.timezone.now')
|
||||||
|
def test_post_saves_valid_keys(self, mock_now, mock_validate):
|
||||||
|
"""Test POST successfully saves valid API keys."""
|
||||||
|
from smoothschedule.commerce.payments.views import ApiKeysView
|
||||||
|
|
||||||
|
# Arrange
|
||||||
|
mock_now.return_value = datetime(2024, 1, 1, 12, 0, 0)
|
||||||
|
mock_validate.return_value = {
|
||||||
|
'valid': True,
|
||||||
|
'account_id': 'acct_123',
|
||||||
|
'account_name': 'Test Account'
|
||||||
|
}
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
data = {
|
||||||
|
'secret_key': 'sk_test_validkey123',
|
||||||
|
'publishable_key': 'pk_test_validkey456'
|
||||||
|
}
|
||||||
|
request = factory.post('/payments/api-keys/', data, format='json')
|
||||||
|
|
||||||
|
mock_tenant = Mock()
|
||||||
|
mock_tenant.id = 1
|
||||||
|
|
||||||
|
# Simulate authenticated user
|
||||||
|
mock_user = Mock(is_authenticated=True)
|
||||||
|
force_authenticate(request, user=mock_user)
|
||||||
|
|
||||||
|
view = ApiKeysView.as_view()
|
||||||
|
# Mock the tenant property on the request
|
||||||
|
request.tenant = mock_tenant
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view(request)
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_201_CREATED
|
||||||
|
assert mock_tenant.stripe_secret_key == 'sk_test_validkey123'
|
||||||
|
assert mock_tenant.stripe_publishable_key == 'pk_test_validkey456'
|
||||||
|
assert mock_tenant.stripe_api_key_status == 'active'
|
||||||
|
assert mock_tenant.payment_mode == 'direct_api'
|
||||||
|
assert mock_tenant.stripe_api_key_account_id == 'acct_123'
|
||||||
|
assert mock_tenant.stripe_api_key_account_name == 'Test Account'
|
||||||
|
assert mock_tenant.stripe_api_key_error == ''
|
||||||
|
mock_tenant.save.assert_called_once()
|
||||||
|
|
||||||
|
def test_post_requires_both_keys(self):
|
||||||
|
"""Test POST returns error when keys are missing."""
|
||||||
|
from smoothschedule.commerce.payments.views import ApiKeysView
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
data = {'secret_key': ''}
|
||||||
|
request = factory.post('/payments/api-keys/', data, format='json')
|
||||||
|
|
||||||
|
mock_tenant = Mock()
|
||||||
|
mock_user = Mock(is_authenticated=True)
|
||||||
|
force_authenticate(request, user=mock_user)
|
||||||
|
request.tenant = mock_tenant
|
||||||
|
|
||||||
|
view = ApiKeysView.as_view()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view(request)
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||||
|
assert 'Both secret_key and publishable_key are required' in response.data['error']
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.validate_stripe_keys')
|
||||||
|
def test_post_returns_error_for_invalid_keys(self, mock_validate):
|
||||||
|
"""Test POST returns error when validation fails."""
|
||||||
|
from smoothschedule.commerce.payments.views import ApiKeysView
|
||||||
|
|
||||||
|
mock_validate.return_value = {
|
||||||
|
'valid': False,
|
||||||
|
'error': 'Invalid secret key'
|
||||||
|
}
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
data = {
|
||||||
|
'secret_key': 'sk_test_invalid',
|
||||||
|
'publishable_key': 'pk_test_invalid'
|
||||||
|
}
|
||||||
|
request = factory.post('/payments/api-keys/', data, format='json')
|
||||||
|
|
||||||
|
mock_tenant = Mock()
|
||||||
|
mock_user = Mock(is_authenticated=True)
|
||||||
|
force_authenticate(request, user=mock_user)
|
||||||
|
request.tenant = mock_tenant
|
||||||
|
|
||||||
|
view = ApiKeysView.as_view()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view(request)
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||||
|
assert 'Invalid secret key' in response.data['error']
|
||||||
|
|
||||||
|
|
||||||
|
# ============================================================================
|
||||||
|
# validate_stripe_keys Tests
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
class TestValidateStripeKeys:
|
||||||
|
"""Test validate_stripe_keys helper function."""
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
|
||||||
|
@patch('smoothschedule.commerce.payments.views.settings')
|
||||||
|
def test_validates_live_keys_successfully(self, mock_settings, mock_retrieve):
|
||||||
|
"""Test validation of live Stripe keys."""
|
||||||
|
from smoothschedule.commerce.payments.views import validate_stripe_keys
|
||||||
|
|
||||||
|
mock_settings.STRIPE_SECRET_KEY = 'sk_live_platform'
|
||||||
|
mock_account = Mock()
|
||||||
|
mock_account.id = 'acct_123'
|
||||||
|
mock_account.get.return_value = {'name': 'Test Business'}
|
||||||
|
mock_retrieve.return_value = mock_account
|
||||||
|
|
||||||
|
result = validate_stripe_keys('sk_live_abc123', 'pk_live_def456')
|
||||||
|
|
||||||
|
assert result['valid'] is True
|
||||||
|
assert result['account_id'] == 'acct_123'
|
||||||
|
assert result['environment'] == 'live'
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
|
||||||
|
@patch('smoothschedule.commerce.payments.views.settings')
|
||||||
|
def test_validates_test_keys_successfully(self, mock_settings, mock_retrieve):
|
||||||
|
"""Test validation of test Stripe keys."""
|
||||||
|
from smoothschedule.commerce.payments.views import validate_stripe_keys
|
||||||
|
|
||||||
|
mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
|
||||||
|
mock_account = Mock()
|
||||||
|
mock_account.id = 'acct_test_123'
|
||||||
|
# Mock the .get() method to return the email
|
||||||
|
mock_account.get.side_effect = lambda key, default='': {
|
||||||
|
'business_profile': {},
|
||||||
|
'email': 'test@example.com'
|
||||||
|
}.get(key, default)
|
||||||
|
mock_retrieve.return_value = mock_account
|
||||||
|
|
||||||
|
result = validate_stripe_keys('sk_test_abc123', 'pk_test_def456')
|
||||||
|
|
||||||
|
assert result['valid'] is True
|
||||||
|
assert result['account_id'] == 'acct_test_123'
|
||||||
|
assert result['account_name'] == 'test@example.com'
|
||||||
|
assert result['environment'] == 'test'
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
|
||||||
|
@patch('smoothschedule.commerce.payments.views.settings')
|
||||||
|
def test_rejects_invalid_publishable_key_format(self, mock_settings, mock_retrieve):
|
||||||
|
"""Test validation fails for invalid publishable key format."""
|
||||||
|
from smoothschedule.commerce.payments.views import validate_stripe_keys
|
||||||
|
|
||||||
|
mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
|
||||||
|
# Mock account retrieval to succeed (secret key is valid)
|
        mock_account = Mock()
        mock_account.id = 'acct_123'
        mock_retrieve.return_value = mock_account

        result = validate_stripe_keys('sk_test_abc123', 'invalid_key')

        assert result['valid'] is False
        assert 'Invalid publishable key format' in result['error']

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_authentication_error(self, mock_settings, mock_retrieve):
        """Test validation handles Stripe authentication errors."""
        from smoothschedule.commerce.payments.views import validate_stripe_keys

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_retrieve.side_effect = stripe.error.AuthenticationError('Invalid key')

        result = validate_stripe_keys('sk_test_invalid', 'pk_test_valid')

        assert result['valid'] is False
        assert 'Invalid secret key' in result['error']

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_generic_stripe_error(self, mock_settings, mock_retrieve):
        """Test validation handles generic Stripe errors."""
        from smoothschedule.commerce.payments.views import validate_stripe_keys

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_retrieve.side_effect = stripe.error.StripeError('API error')

        result = validate_stripe_keys('sk_test_key', 'pk_test_key')

        assert result['valid'] is False
        assert 'API error' in result['error']

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_uses_business_profile_name(self, mock_settings, mock_retrieve):
        """Test uses business_profile.name when available."""
        from smoothschedule.commerce.payments.views import validate_stripe_keys

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_account = Mock()
        mock_account.id = 'acct_123'
        mock_account.get.side_effect = lambda key, default=None: {
            'business_profile': {'name': 'My Business'},
            'email': 'fallback@example.com'
        }.get(key, default)
        mock_retrieve.return_value = mock_account

        result = validate_stripe_keys('sk_test_key', 'pk_test_key')

        assert result['valid'] is True
        assert result['account_name'] == 'My Business'

# ============================================================================
# ApiKeysRevalidateView Tests
# ============================================================================


class TestApiKeysRevalidateView:
    """Test ApiKeysRevalidateView comprehensively."""

    def test_returns_error_when_no_keys_configured(self):
        """Test returns 400 when no API keys are configured."""
        from smoothschedule.commerce.payments.views import ApiKeysRevalidateView

        factory = APIRequestFactory()
        request = factory.post('/payments/api-keys/revalidate/')
        request.user = Mock(is_authenticated=True)
        request.tenant = Mock(stripe_secret_key=None)

        view = ApiKeysRevalidateView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'No API keys configured' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.timezone.now')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_revalidates_keys_successfully(self, mock_settings, mock_now, mock_retrieve):
        """Test successful revalidation of stored keys."""
        from smoothschedule.commerce.payments.views import ApiKeysRevalidateView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_now.return_value = datetime(2024, 1, 1, 12, 0, 0)
        mock_account = Mock()
        mock_account.id = 'acct_123'
        mock_account.get.return_value = {'name': 'Updated Business'}
        mock_retrieve.return_value = mock_account

        mock_tenant = Mock()
        mock_tenant.stripe_secret_key = 'sk_test_existing'

        factory = APIRequestFactory()
        request = factory.post('/payments/api-keys/revalidate/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ApiKeysRevalidateView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['valid'] is True
        assert response.data['account_id'] == 'acct_123'
        assert mock_tenant.stripe_api_key_status == 'active'
        assert mock_tenant.stripe_api_key_error == ''
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_authentication_error_on_revalidation(self, mock_settings, mock_retrieve):
        """Test handles authentication error during revalidation."""
        from smoothschedule.commerce.payments.views import ApiKeysRevalidateView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_retrieve.side_effect = stripe.error.AuthenticationError('Invalid')

        mock_tenant = Mock()
        mock_tenant.stripe_secret_key = 'sk_test_invalid'

        factory = APIRequestFactory()
        request = factory.post('/payments/api-keys/revalidate/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ApiKeysRevalidateView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['valid'] is False
        assert 'Invalid secret key' in response.data['error']
        assert mock_tenant.stripe_api_key_status == 'invalid'
        assert mock_tenant.stripe_api_key_error == 'Invalid secret key'
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error_on_revalidation(self, mock_settings, mock_retrieve):
        """Test handles generic Stripe error during revalidation."""
        from smoothschedule.commerce.payments.views import ApiKeysRevalidateView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_retrieve.side_effect = stripe.error.StripeError('Network error')

        mock_tenant = Mock()
        mock_tenant.stripe_secret_key = 'sk_test_key'

        factory = APIRequestFactory()
        request = factory.post('/payments/api-keys/revalidate/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ApiKeysRevalidateView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['valid'] is False
        assert 'Network error' in response.data['error']
        assert mock_tenant.stripe_api_key_status == 'invalid'
        mock_tenant.save.assert_called_once()

# ============================================================================
# ApiKeysDeleteView Tests
# ============================================================================


class TestApiKeysDeleteView:
    """Test ApiKeysDeleteView comprehensively."""

    def test_deletes_api_keys_successfully(self):
        """Test DELETE successfully removes API keys."""
        from smoothschedule.commerce.payments.views import ApiKeysDeleteView

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'direct_api'

        factory = APIRequestFactory()
        request = factory.delete('/payments/api-keys/delete/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ApiKeysDeleteView()

        # Act
        response = view.delete(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['success'] is True
        assert mock_tenant.stripe_secret_key == ''
        assert mock_tenant.stripe_publishable_key == ''
        assert mock_tenant.stripe_api_key_status == ''
        assert mock_tenant.stripe_api_key_validated_at is None
        assert mock_tenant.payment_mode == 'none'
        mock_tenant.save.assert_called_once()

    def test_deletes_keys_without_changing_connect_mode(self):
        """Test DELETE preserves connect payment mode."""
        from smoothschedule.commerce.payments.views import ApiKeysDeleteView

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'connect'

        factory = APIRequestFactory()
        request = factory.delete('/payments/api-keys/delete/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ApiKeysDeleteView()

        # Act
        response = view.delete(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        # Payment mode should remain 'connect', not changed to 'none'
        assert mock_tenant.payment_mode == 'connect'
        mock_tenant.save.assert_called_once()

# ============================================================================
# ConnectOnboardView Tests
# ============================================================================


class TestConnectOnboardView:
    """Test ConnectOnboardView comprehensively."""

    def test_requires_refresh_and_return_urls(self):
        """Test POST requires refresh_url and return_url."""
        from smoothschedule.commerce.payments.views import ConnectOnboardView

        factory = APIRequestFactory()
        data = {}
        request = factory.post('/payments/connect/onboard/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = Mock()

        view = ConnectOnboardView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'refresh_url and return_url are required' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.stripe.AccountLink.create')
    @patch('smoothschedule.commerce.payments.views.stripe.Account.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_new_connect_account(self, mock_settings, mock_account_create, mock_link_create):
        """Test creates new Connect account when none exists."""
        from smoothschedule.commerce.payments.views import ConnectOnboardView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_account = Mock()
        mock_account.id = 'acct_new123'
        mock_account_create.return_value = mock_account

        mock_link = Mock()
        mock_link.url = 'https://connect.stripe.com/setup/abc123'
        mock_link_create.return_value = mock_link

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'New Business'
        mock_tenant.schema_name = 'newbiz'
        mock_tenant.contact_email = 'contact@newbiz.com'
        mock_tenant.stripe_connect_id = None

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/onboard/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = mock_tenant

        view = ConnectOnboardView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['stripe_account_id'] == 'acct_new123'
        assert response.data['url'] == 'https://connect.stripe.com/setup/abc123'
        assert mock_tenant.stripe_connect_id == 'acct_new123'
        assert mock_tenant.stripe_connect_status == 'onboarding'
        assert mock_tenant.payment_mode == 'connect'
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.AccountLink.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_link_for_existing_account(self, mock_settings, mock_link_create):
        """Test creates onboarding link for existing Connect account."""
        from smoothschedule.commerce.payments.views import ConnectOnboardView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_link = Mock()
        mock_link.url = 'https://connect.stripe.com/setup/existing'
        mock_link_create.return_value = mock_link

        mock_tenant = Mock()
        mock_tenant.stripe_connect_id = 'acct_existing'

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/onboard/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = mock_tenant

        view = ConnectOnboardView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['url'] == 'https://connect.stripe.com/setup/existing'
        mock_link_create.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error(self, mock_settings, mock_account_create):
        """Test handles Stripe API errors."""
        from smoothschedule.commerce.payments.views import ConnectOnboardView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_account_create.side_effect = stripe.error.StripeError('API error')

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test'
        mock_tenant.schema_name = 'test'
        mock_tenant.contact_email = None
        mock_tenant.stripe_connect_id = None

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/onboard/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = mock_tenant

        view = ConnectOnboardView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
        assert 'API error' in response.data['error']

# ============================================================================
# ConnectRefreshLinkView Tests
# ============================================================================


class TestConnectRefreshLinkView:
    """Test ConnectRefreshLinkView comprehensively."""

    def test_requires_refresh_and_return_urls(self):
        """Test POST requires both URLs."""
        from smoothschedule.commerce.payments.views import ConnectRefreshLinkView

        factory = APIRequestFactory()
        data = {'refresh_url': 'http://example.com/refresh'}
        request = factory.post('/payments/connect/refresh-link/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = Mock()

        view = ConnectRefreshLinkView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'refresh_url and return_url are required' in response.data['error']

    def test_returns_error_when_no_connect_account(self):
        """Test returns 400 when no Connect account exists."""
        from smoothschedule.commerce.payments.views import ConnectRefreshLinkView

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/refresh-link/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = Mock(stripe_connect_id=None)

        view = ConnectRefreshLinkView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'No Connect account exists' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.stripe.AccountLink.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_refresh_link_successfully(self, mock_settings, mock_link_create):
        """Test successfully creates a new onboarding link."""
        from smoothschedule.commerce.payments.views import ConnectRefreshLinkView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_link = Mock()
        mock_link.url = 'https://connect.stripe.com/setup/refreshed'
        mock_link_create.return_value = mock_link

        mock_tenant = Mock()
        mock_tenant.stripe_connect_id = 'acct_123'

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/refresh-link/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = mock_tenant

        view = ConnectRefreshLinkView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['url'] == 'https://connect.stripe.com/setup/refreshed'

    @patch('smoothschedule.commerce.payments.views.stripe.AccountLink.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error(self, mock_settings, mock_link_create):
        """Test handles Stripe API errors."""
        from smoothschedule.commerce.payments.views import ConnectRefreshLinkView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_link_create.side_effect = stripe.error.StripeError('Link creation failed')

        mock_tenant = Mock()
        mock_tenant.stripe_connect_id = 'acct_123'

        factory = APIRequestFactory()
        data = {
            'refresh_url': 'http://example.com/refresh',
            'return_url': 'http://example.com/return'
        }
        request = factory.post('/payments/connect/refresh-link/', data, format='json')

        mock_user = Mock(is_authenticated=True)
        force_authenticate(request, user=mock_user)
        request.tenant = mock_tenant

        view = ConnectRefreshLinkView.as_view()

        # Act
        response = view(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
        assert 'Link creation failed' in response.data['error']

# ============================================================================
# ConnectAccountSessionView Tests (Embedded Connect)
# ============================================================================


class TestConnectAccountSessionView:
    """Test ConnectAccountSessionView for embedded Connect."""

    @patch('smoothschedule.commerce.payments.views.stripe.AccountSession.create')
    @patch('smoothschedule.commerce.payments.views.stripe.Account.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_custom_account_when_none_exists(self, mock_settings, mock_account_create, mock_session_create):
        """Test creates Custom Connect account for embedded onboarding."""
        from smoothschedule.commerce.payments.views import ConnectAccountSessionView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_settings.STRIPE_PUBLISHABLE_KEY = 'pk_test_platform'

        mock_account = Mock()
        mock_account.id = 'acct_custom123'
        mock_account_create.return_value = mock_account

        mock_session = Mock()
        mock_session.client_secret = 'cas_secret_abc123'
        mock_session_create.return_value = mock_session

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test Business'
        mock_tenant.schema_name = 'test'
        mock_tenant.contact_email = 'test@example.com'
        mock_tenant.stripe_connect_id = None

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/account-session/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectAccountSessionView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['client_secret'] == 'cas_secret_abc123'
        assert response.data['stripe_account_id'] == 'acct_custom123'
        assert response.data['publishable_key'] == 'pk_test_platform'
        assert mock_tenant.stripe_connect_id == 'acct_custom123'
        assert mock_tenant.stripe_connect_status == 'onboarding'
        assert mock_tenant.payment_mode == 'connect'
        mock_tenant.save.assert_called_once()

        # Verify Custom account was created with correct params
        mock_account_create.assert_called_once_with(
            type='custom',
            country='US',
            email='test@example.com',
            capabilities={
                'card_payments': {'requested': True},
                'transfers': {'requested': True},
            },
            business_type='company',
            business_profile={
                'name': 'Test Business',
                'mcc': '7299',
            },
            metadata={
                'tenant_id': '1',
                'tenant_schema': 'test',
            }
        )

    @patch('smoothschedule.commerce.payments.views.stripe.AccountSession.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_session_for_existing_account(self, mock_settings, mock_session_create):
        """Test creates AccountSession for existing Connect account."""
        from smoothschedule.commerce.payments.views import ConnectAccountSessionView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_settings.STRIPE_PUBLISHABLE_KEY = 'pk_test_platform'

        mock_session = Mock()
        mock_session.client_secret = 'cas_secret_existing'
        mock_session_create.return_value = mock_session

        mock_tenant = Mock()
        mock_tenant.stripe_connect_id = 'acct_existing'

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/account-session/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectAccountSessionView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['client_secret'] == 'cas_secret_existing'
        assert response.data['stripe_account_id'] == 'acct_existing'

    @patch('smoothschedule.commerce.payments.views.stripe.Account.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error(self, mock_settings, mock_account_create):
        """Test handles Stripe API errors."""
        from smoothschedule.commerce.payments.views import ConnectAccountSessionView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_account_create.side_effect = stripe.error.StripeError('Account creation failed')

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test'
        mock_tenant.schema_name = 'test'
        mock_tenant.contact_email = None
        mock_tenant.stripe_connect_id = None

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/account-session/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectAccountSessionView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
        assert 'Account creation failed' in response.data['error']

# ============================================================================
# ConnectRefreshStatusView Tests
# ============================================================================


class TestConnectRefreshStatusView:
    """Test ConnectRefreshStatusView comprehensively."""

    def test_returns_error_when_no_connect_account(self):
        """Test returns 400 when no Connect account exists."""
        from smoothschedule.commerce.payments.views import ConnectRefreshStatusView

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/refresh-status/')
        request.user = Mock(is_authenticated=True)
        request.tenant = Mock(stripe_connect_id=None)

        view = ConnectRefreshStatusView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'No Connect account exists' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_updates_status_to_active(self, mock_settings, mock_retrieve):
        """Test updates status to active when account is fully enabled."""
        from smoothschedule.commerce.payments.views import ConnectRefreshStatusView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_account = Mock()
        mock_account.charges_enabled = True
        mock_account.payouts_enabled = True
        mock_account.details_submitted = True
        mock_retrieve.return_value = mock_account

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test Business'
        mock_tenant.schema_name = 'test'
        mock_tenant.stripe_connect_id = 'acct_123'
        mock_tenant.created_on = datetime(2024, 1, 1)

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/refresh-status/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectRefreshStatusView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert mock_tenant.stripe_charges_enabled is True
        assert mock_tenant.stripe_payouts_enabled is True
        assert mock_tenant.stripe_details_submitted is True
        assert mock_tenant.stripe_connect_status == 'active'
        assert mock_tenant.stripe_onboarding_complete is True
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_updates_status_to_onboarding(self, mock_settings, mock_retrieve):
        """Test updates status to onboarding when details submitted but not enabled."""
        from smoothschedule.commerce.payments.views import ConnectRefreshStatusView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_account = Mock()
        mock_account.charges_enabled = False
        mock_account.payouts_enabled = False
        mock_account.details_submitted = True
        mock_retrieve.return_value = mock_account

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test Business'
        mock_tenant.schema_name = 'test'
        mock_tenant.stripe_connect_id = 'acct_123'
        mock_tenant.created_on = None

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/refresh-status/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectRefreshStatusView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert mock_tenant.stripe_connect_status == 'onboarding'
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_updates_status_to_pending(self, mock_settings, mock_retrieve):
        """Test updates status to pending when details not submitted."""
        from smoothschedule.commerce.payments.views import ConnectRefreshStatusView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        mock_account = Mock()
        mock_account.charges_enabled = False
        mock_account.payouts_enabled = False
        mock_account.details_submitted = False
        mock_retrieve.return_value = mock_account

        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.name = 'Test'
        mock_tenant.schema_name = 'test'
        mock_tenant.stripe_connect_id = 'acct_123'
        mock_tenant.created_on = None

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/refresh-status/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectRefreshStatusView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert mock_tenant.stripe_connect_status == 'pending'
        mock_tenant.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.stripe.Account.retrieve')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error(self, mock_settings, mock_retrieve):
        """Test handles Stripe API errors."""
        from smoothschedule.commerce.payments.views import ConnectRefreshStatusView

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_retrieve.side_effect = stripe.error.StripeError('Retrieval failed')

        mock_tenant = Mock()
        mock_tenant.stripe_connect_id = 'acct_123'

        factory = APIRequestFactory()
        request = factory.post('/payments/connect/refresh-status/')
        request.user = Mock(is_authenticated=True)
        request.tenant = mock_tenant

        view = ConnectRefreshStatusView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
        assert 'Retrieval failed' in response.data['error']
@@ -0,0 +1,890 @@
"""
Comprehensive unit tests for Customer Billing and Payment Method Views.

These tests cover customer-facing payment endpoints for managing payment methods
and viewing billing information. All tests use mocks to avoid database access.
"""
from unittest.mock import Mock, patch, MagicMock
from rest_framework.test import APIRequestFactory
from rest_framework.request import Request
from rest_framework import status
import pytest
from decimal import Decimal
from datetime import datetime
import stripe
from django.utils import timezone


# ============================================================================
# CustomerBillingView Tests
# ============================================================================

class TestCustomerBillingView:
    """Test CustomerBillingView comprehensively."""

    def test_returns_403_for_non_customer(self):
        """Test returns 403 when user is not a customer."""
        from smoothschedule.commerce.payments.views import CustomerBillingView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/billing/')
        request.user = Mock(role=User.Role.TENANT_STAFF, is_authenticated=True)

        view = CustomerBillingView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_403_FORBIDDEN
        assert 'only for customers' in response.data['error']

    @patch('smoothschedule.scheduling.schedule.models.Participant.objects.filter')
    @patch('django.contrib.contenttypes.models.ContentType.objects.get_for_model')
    @patch('smoothschedule.commerce.payments.models.TransactionLink.objects.filter')
    def test_returns_billing_data_for_customer(self, mock_tx_filter, mock_content_type, mock_participant_filter):
        """Test returns billing data for authenticated customer."""
        from smoothschedule.commerce.payments.views import CustomerBillingView
        from smoothschedule.identity.users.models import User

        # Mock user content type
        mock_user_ct = Mock()
        mock_content_type.return_value = mock_user_ct

        # Mock customer with participations
        mock_event1 = Mock()
        mock_event1.id = 1
        mock_event1.title = 'Haircut'
        mock_event1.status = 'CONFIRMED'
        mock_event1.start_time = timezone.now()
        mock_event1.service = Mock(name='Haircut Service', price=Decimal('25.00'))

        mock_participation1 = Mock()
        mock_participation1.event_id = 1
        mock_participation1.event = mock_event1
        mock_participant_filter.return_value.select_related.return_value = [mock_participation1]

        # Mock completed transaction
        mock_tx1 = Mock()
        mock_tx1.id = 1
        mock_tx1.event_id = 1
        mock_tx1.event = mock_event1
        mock_tx1.amount = Decimal('25.00')
        mock_tx1.currency = 'USD'
        mock_tx1.status = 'SUCCEEDED'
        mock_tx1.payment_intent_id = 'pi_123'
        mock_tx1.created_at = timezone.now()
        mock_tx1.completed_at = timezone.now()
        mock_tx_filter.return_value.select_related.return_value.order_by.return_value = [mock_tx1]

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/billing/')
        mock_user = Mock()
        mock_user.id = 1
        mock_user.role = User.Role.CUSTOMER
        mock_user.is_authenticated = True
        request.user = mock_user

        view = CustomerBillingView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert 'outstanding' in response.data
        assert 'payment_history' in response.data
        assert 'summary' in response.data
        assert len(response.data['payment_history']) == 1
        assert response.data['payment_history'][0]['amount'] == 25.00

    @patch('smoothschedule.scheduling.schedule.models.Participant.objects.filter')
    @patch('django.contrib.contenttypes.models.ContentType.objects.get_for_model')
    @patch('smoothschedule.commerce.payments.models.TransactionLink.objects.filter')
    def test_includes_outstanding_events(self, mock_tx_filter, mock_content_type, mock_participant_filter):
        """Test includes events with pending/no payment in outstanding."""
        from smoothschedule.commerce.payments.views import CustomerBillingView
        from smoothschedule.identity.users.models import User

        mock_user_ct = Mock()
        mock_content_type.return_value = mock_user_ct

        # Mock event with no payment
        mock_event2 = Mock()
        mock_event2.id = 2
        mock_event2.title = 'Massage'
        mock_event2.status = 'CONFIRMED'
        mock_event2.start_time = timezone.now()
        mock_event2.end_time = timezone.now()
        mock_event2.service = Mock(name='Massage', price=Decimal('75.00'))

        mock_participation2 = Mock()
        mock_participation2.event = mock_event2
        mock_participation2.event_id = 2
        mock_participant_filter.return_value.select_related.return_value = [mock_participation2]

        # No transactions for this event
        mock_tx_filter.return_value.select_related.return_value.order_by.return_value = []

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/billing/')
        mock_user = Mock()
        mock_user.id = 1
        mock_user.role = User.Role.CUSTOMER
        request.user = mock_user

        view = CustomerBillingView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert len(response.data['outstanding']) == 1
        assert response.data['outstanding'][0]['amount'] == 75.00
        assert response.data['outstanding'][0]['payment_status'] == 'unpaid'

    @patch('smoothschedule.scheduling.schedule.models.Participant.objects.filter')
    @patch('django.contrib.contenttypes.models.ContentType.objects.get_for_model')
    @patch('smoothschedule.commerce.payments.models.TransactionLink.objects.filter')
    def test_excludes_cancelled_events_from_outstanding(self, mock_tx_filter, mock_content_type, mock_participant_filter):
        """Test excludes cancelled events from outstanding."""
        from smoothschedule.commerce.payments.views import CustomerBillingView
        from smoothschedule.identity.users.models import User

        mock_user_ct = Mock()
        mock_content_type.return_value = mock_user_ct

        # Mock cancelled event
        mock_event = Mock()
        mock_event.id = 3
        mock_event.status = 'CANCELLED'
        mock_event.service = None

        mock_participation = Mock()
        mock_participation.event = mock_event
        mock_participation.event_id = 3
        mock_participant_filter.return_value.select_related.return_value = [mock_participation]

        mock_tx_filter.return_value.select_related.return_value.order_by.return_value = []

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/billing/')
        mock_user = Mock()
        mock_user.id = 1
        mock_user.role = User.Role.CUSTOMER
        request.user = mock_user

        view = CustomerBillingView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        # Cancelled event should not be in outstanding
        assert len(response.data['outstanding']) == 0

# ============================================================================
# CustomerPaymentMethodsView Tests
# ============================================================================

class TestCustomerPaymentMethodsView:
    """Test CustomerPaymentMethodsView comprehensively."""

    def test_returns_403_for_non_customer(self):
        """Test returns 403 when user is not a customer."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodsView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/payment-methods/')
        request.user = Mock(role=User.Role.TENANT_STAFF, is_authenticated=True)

        view = CustomerPaymentMethodsView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_403_FORBIDDEN
        assert 'only for customers' in response.data['error']

    def test_returns_empty_when_no_stripe_customer_id(self):
        """Test returns empty list when user has no Stripe customer ID."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodsView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/payment-methods/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = None
        request.user = mock_user

        view = CustomerPaymentMethodsView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['payment_methods'] == []
        assert response.data['has_stripe_customer'] is False

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_returns_payment_methods_list(self, mock_get_service):
        """Test returns list of payment methods from Stripe."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodsView
        from smoothschedule.identity.users.models import User

        # Mock Stripe service
        mock_pm1 = Mock()
        mock_pm1.id = 'pm_123'
        mock_pm1.type = 'card'
        mock_pm1.card = Mock(brand='visa', last4='4242', exp_month=12, exp_year=2025)

        mock_pm_list = Mock()
        mock_pm_list.data = [mock_pm1]

        mock_service = Mock()
        mock_service.list_payment_methods.return_value = mock_pm_list
        mock_get_service.return_value = mock_service

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/payment-methods/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        mock_user.default_payment_method_id = 'pm_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodsView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['has_stripe_customer'] is True
        assert len(response.data['payment_methods']) == 1
        assert response.data['payment_methods'][0]['brand'] == 'visa'
        assert response.data['payment_methods'][0]['last4'] == '4242'
        assert response.data['payment_methods'][0]['is_default'] is True

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_handles_stripe_not_configured(self, mock_get_service):
        """Test handles when Stripe is not configured for tenant."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodsView
        from smoothschedule.identity.users.models import User

        mock_get_service.side_effect = ValueError('Stripe not configured')

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/payment-methods/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodsView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['payment_methods'] == []
        assert response.data['has_stripe_customer'] is False

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_handles_generic_exception(self, mock_get_service):
        """Test handles generic exceptions gracefully."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodsView
        from smoothschedule.identity.users.models import User

        mock_service = Mock()
        mock_service.list_payment_methods.side_effect = Exception('Network error')
        mock_get_service.return_value = mock_service

        factory = APIRequestFactory()
        request = factory.get('/payments/customer/payment-methods/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodsView()

        # Act
        response = view.get(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['payment_methods'] == []
        assert 'try again later' in response.data['message']

# ============================================================================
# CustomerSetupIntentView Tests
# ============================================================================

class TestCustomerSetupIntentView:
    """Test CustomerSetupIntentView comprehensively."""

    def test_returns_403_for_non_customer(self):
        """Test returns 403 when user is not a customer."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        request.user = Mock(role=User.Role.TENANT_OWNER, is_authenticated=True)
        request.tenant = Mock()

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_403_FORBIDDEN
        assert 'only for customers' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.stripe.SetupIntent.create')
    @patch('smoothschedule.commerce.payments.views.stripe.Customer.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_setup_intent_direct_api_mode(self, mock_settings, mock_customer_create, mock_setup_create):
        """Test creates SetupIntent for direct_api mode."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        # Mock customer creation
        mock_customer = Mock()
        mock_customer.id = 'cus_new123'
        mock_customer_create.return_value = mock_customer

        # Mock SetupIntent creation
        mock_setup_intent = Mock()
        mock_setup_intent.id = 'seti_123'
        mock_setup_intent.client_secret = 'seti_secret_abc123'
        mock_setup_create.return_value = mock_setup_intent

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'direct_api'
        mock_tenant.stripe_secret_key = 'sk_test_tenant'
        mock_tenant.stripe_publishable_key = 'pk_test_tenant'
        mock_tenant.name = 'Test Business'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.id = 1
        mock_user.role = User.Role.CUSTOMER
        mock_user.email = 'customer@example.com'
        mock_user.get_full_name.return_value = 'John Doe'
        mock_user.username = 'johndoe'
        mock_user.stripe_customer_id = None  # No existing customer
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['client_secret'] == 'seti_secret_abc123'
        assert response.data['setup_intent_id'] == 'seti_123'
        assert response.data['customer_id'] == 'cus_new123'
        assert response.data['stripe_account'] == ''  # Empty for direct_api
        assert response.data['publishable_key'] == 'pk_test_tenant'
        assert mock_user.stripe_customer_id == 'cus_new123'
        mock_user.save.assert_called_once()

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_creates_setup_intent_connect_mode(self, mock_settings, mock_get_service):
        """Test creates SetupIntent for connect mode."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'

        # Mock Stripe service
        mock_setup_intent = Mock()
        mock_setup_intent.id = 'seti_connect'
        mock_setup_intent.client_secret = 'seti_secret_connect'

        mock_service = Mock()
        mock_service.create_or_get_customer.return_value = 'cus_connect123'
        mock_service.create_setup_intent.return_value = mock_setup_intent
        mock_get_service.return_value = mock_service

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'connect'
        mock_tenant.stripe_connect_id = 'acct_123'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['client_secret'] == 'seti_secret_connect'
        assert response.data['stripe_account'] == 'acct_123'
        assert response.data['publishable_key'] is None  # None for connect mode

    def test_returns_400_when_payment_not_configured(self):
        """Test returns 400 when payment is not configured."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'none'
        mock_tenant.name = 'Test'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert 'not available' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_handles_value_error(self, mock_get_service):
        """Test handles ValueError from get_stripe_service_for_tenant."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_get_service.side_effect = ValueError('Stripe not configured')

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'connect'
        mock_tenant.stripe_connect_id = 'acct_123'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST

    @patch('smoothschedule.commerce.payments.views.stripe.Customer.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_stripe_error(self, mock_settings, mock_customer_create):
        """Test handles StripeError."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_customer_create.side_effect = stripe.error.StripeError('API error')

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'direct_api'
        mock_tenant.stripe_secret_key = 'sk_test_tenant'
        mock_tenant.name = 'Test'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.email = 'test@example.com'
        mock_user.get_full_name.return_value = 'Test'
        mock_user.username = 'test'
        mock_user.stripe_customer_id = None
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR

    @patch('smoothschedule.commerce.payments.views.stripe.SetupIntent.create')
    @patch('smoothschedule.commerce.payments.views.settings')
    def test_handles_generic_exception(self, mock_settings, mock_setup_create):
        """Test handles generic exceptions."""
        from smoothschedule.commerce.payments.views import CustomerSetupIntentView
        from smoothschedule.identity.users.models import User

        mock_settings.STRIPE_SECRET_KEY = 'sk_test_platform'
        mock_setup_create.side_effect = Exception('Unexpected error')

        mock_tenant = Mock()
        mock_tenant.payment_mode = 'direct_api'
        mock_tenant.stripe_secret_key = 'sk_test_tenant'

        factory = APIRequestFactory()
        request = factory.post('/payments/customer/setup-intent/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_existing'
        request.user = mock_user
        request.tenant = mock_tenant

        view = CustomerSetupIntentView()

        # Act
        response = view.post(request)

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR

# ============================================================================
# CustomerPaymentMethodDeleteView Tests
# ============================================================================

class TestCustomerPaymentMethodDeleteView:
    """Test CustomerPaymentMethodDeleteView comprehensively."""

    def test_returns_403_for_non_customer(self):
        """Test returns 403 when user is not a customer."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_123/')
        request.user = Mock(role=User.Role.TENANT_STAFF, is_authenticated=True)

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_123')

        # Assert
        assert response.status_code == status.HTTP_403_FORBIDDEN

    def test_returns_404_when_no_stripe_customer_id(self):
        """Test returns 404 when customer has no Stripe ID."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_123/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = None
        request.user = mock_user

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_123')

        # Assert
        assert response.status_code == status.HTTP_404_NOT_FOUND
        assert 'No payment methods on file' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_deletes_payment_method_successfully(self, mock_get_service):
        """Test successfully deletes a payment method."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        # Mock payment methods list
        mock_pm1 = Mock()
        mock_pm1.id = 'pm_123'
        mock_pm_list = Mock()
        mock_pm_list.data = [mock_pm1]

        mock_service = Mock()
        mock_service.list_payment_methods.return_value = mock_pm_list
        mock_service.detach_payment_method.return_value = None
        mock_get_service.return_value = mock_service

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_123/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_123')

        # Assert
        assert response.status_code == status.HTTP_200_OK
        assert response.data['success'] is True
        mock_service.detach_payment_method.assert_called_once_with('pm_123')

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_returns_404_for_nonexistent_payment_method(self, mock_get_service):
        """Test returns 404 when payment method doesn't belong to customer."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        mock_pm_list = Mock()
        mock_pm_list.data = []  # No payment methods

        mock_service = Mock()
        mock_service.list_payment_methods.return_value = mock_pm_list
        mock_get_service.return_value = mock_service

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_999/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_999')

        # Assert
        assert response.status_code == status.HTTP_404_NOT_FOUND
        assert 'Payment method not found' in response.data['error']

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_handles_stripe_not_configured(self, mock_get_service):
        """Test handles when Stripe is not configured."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        mock_get_service.side_effect = ValueError('Not configured')

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_123/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_123')

        # Assert
        assert response.status_code == status.HTTP_400_BAD_REQUEST

    @patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
    def test_handles_generic_exception(self, mock_get_service):
        """Test handles generic exceptions gracefully."""
        from smoothschedule.commerce.payments.views import CustomerPaymentMethodDeleteView
        from smoothschedule.identity.users.models import User

        mock_service = Mock()
        mock_service.list_payment_methods.side_effect = Exception('Error')
        mock_get_service.return_value = mock_service

        factory = APIRequestFactory()
        request = factory.delete('/payments/customer/payment-methods/pm_123/')
        mock_user = Mock()
        mock_user.role = User.Role.CUSTOMER
        mock_user.stripe_customer_id = 'cus_123'
        request.user = mock_user
        request.tenant = Mock()

        view = CustomerPaymentMethodDeleteView()

        # Act
        response = view.delete(request, payment_method_id='pm_123')

        # Assert
        assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR


# ============================================================================
# CustomerPaymentMethodDefaultView Tests
|
||||||
|
# ============================================================================
|
||||||
|
|
||||||
|
class TestCustomerPaymentMethodDefaultView:
|
||||||
|
"""Test CustomerPaymentMethodDefaultView comprehensively."""
|
||||||
|
|
||||||
|
def test_returns_403_for_non_customer(self):
|
||||||
|
"""Test returns 403 when user is not a customer."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_123/default/')
|
||||||
|
request.user = Mock(role=User.Role.TENANT_STAFF, is_authenticated=True)
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_123')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_403_FORBIDDEN
|
||||||
|
|
||||||
|
def test_returns_404_when_no_stripe_customer_id(self):
|
||||||
|
"""Test returns 404 when customer has no Stripe ID."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_123/default/')
|
||||||
|
mock_user = Mock()
|
||||||
|
mock_user.role = User.Role.CUSTOMER
|
||||||
|
mock_user.stripe_customer_id = None
|
||||||
|
request.user = mock_user
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_123')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_404_NOT_FOUND
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
|
||||||
|
def test_sets_default_payment_method_successfully(self, mock_get_service):
|
||||||
|
"""Test successfully sets default payment method."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
# Mock payment methods list
|
||||||
|
mock_pm1 = Mock()
|
||||||
|
mock_pm1.id = 'pm_123'
|
||||||
|
mock_pm_list = Mock()
|
||||||
|
mock_pm_list.data = [mock_pm1]
|
||||||
|
|
||||||
|
mock_service = Mock()
|
||||||
|
mock_service.list_payment_methods.return_value = mock_pm_list
|
||||||
|
mock_service.set_default_payment_method.return_value = None
|
||||||
|
mock_get_service.return_value = mock_service
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_123/default/')
|
||||||
|
mock_user = Mock()
|
||||||
|
mock_user.role = User.Role.CUSTOMER
|
||||||
|
mock_user.stripe_customer_id = 'cus_123'
|
||||||
|
mock_user.default_payment_method_id = None
|
||||||
|
request.user = mock_user
|
||||||
|
request.tenant = Mock()
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_123')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_200_OK
|
||||||
|
assert response.data['success'] is True
|
||||||
|
assert mock_user.default_payment_method_id == 'pm_123'
|
||||||
|
mock_user.save.assert_called_once()
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
|
||||||
|
def test_returns_404_for_nonexistent_payment_method(self, mock_get_service):
|
||||||
|
"""Test returns 404 when payment method doesn't exist."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
mock_pm_list = Mock()
|
||||||
|
mock_pm_list.data = [] # No payment methods
|
||||||
|
|
||||||
|
mock_service = Mock()
|
||||||
|
mock_service.list_payment_methods.return_value = mock_pm_list
|
||||||
|
mock_get_service.return_value = mock_service
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_999/default/')
|
||||||
|
mock_user = Mock()
|
||||||
|
mock_user.role = User.Role.CUSTOMER
|
||||||
|
mock_user.stripe_customer_id = 'cus_123'
|
||||||
|
request.user = mock_user
|
||||||
|
request.tenant = Mock()
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_999')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_404_NOT_FOUND
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
|
||||||
|
def test_handles_stripe_not_configured(self, mock_get_service):
|
||||||
|
"""Test handles when Stripe is not configured."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
mock_get_service.side_effect = ValueError('Not configured')
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_123/default/')
|
||||||
|
mock_user = Mock()
|
||||||
|
mock_user.role = User.Role.CUSTOMER
|
||||||
|
mock_user.stripe_customer_id = 'cus_123'
|
||||||
|
request.user = mock_user
|
||||||
|
request.tenant = Mock()
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_123')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.payments.views.get_stripe_service_for_tenant')
|
||||||
|
def test_handles_generic_exception(self, mock_get_service):
|
||||||
|
"""Test handles generic exceptions gracefully."""
|
||||||
|
from smoothschedule.commerce.payments.views import CustomerPaymentMethodDefaultView
|
||||||
|
from smoothschedule.identity.users.models import User
|
||||||
|
|
||||||
|
mock_service = Mock()
|
||||||
|
mock_service.list_payment_methods.side_effect = Exception('Error')
|
||||||
|
mock_get_service.return_value = mock_service
|
||||||
|
|
||||||
|
factory = APIRequestFactory()
|
||||||
|
request = factory.post('/payments/customer/payment-methods/pm_123/default/')
|
||||||
|
mock_user = Mock()
|
||||||
|
mock_user.role = User.Role.CUSTOMER
|
||||||
|
mock_user.stripe_customer_id = 'cus_123'
|
||||||
|
request.user = mock_user
|
||||||
|
request.tenant = Mock()
|
||||||
|
|
||||||
|
view = CustomerPaymentMethodDefaultView()
|
||||||
|
|
||||||
|
# Act
|
||||||
|
response = view.post(request, payment_method_id='pm_123')
|
||||||
|
|
||||||
|
# Assert
|
||||||
|
assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
|
||||||
@@ -0,0 +1,366 @@
"""
Additional unit tests for email processing logic in email_receiver.py.

Focuses on process_single_email, fetch_and_process, create_ticket logic.
"""
from unittest.mock import Mock, patch, MagicMock, PropertyMock
import pytest
from email.message import EmailMessage


class TestFetchAndProcessEmailsFlow:
    """Tests for complete fetch and process flow."""

    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.disconnect')
    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.connect')
    def test_updates_last_check_time_on_success(self, mock_connect, mock_disconnect):
        """Should update last_check_at and emails_processed_count on success."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
        from django.utils import timezone

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_imap_configured = True
        mock_email_address.is_smtp_configured = True
        mock_email_address.is_active = True
        mock_email_address.imap_folder = 'INBOX'
        mock_email_address.emails_processed_count = 5

        receiver = TicketEmailReceiver(mock_email_address)

        # Mock connection
        mock_conn = Mock()
        mock_conn.select.return_value = ('OK', None)
        mock_conn.search.return_value = ('OK', [b'1 2'])
        receiver.connection = mock_conn

        with patch.object(receiver, '_process_single_email', return_value=True):
            result = receiver.fetch_and_process_emails()

        # Check last_check_at was set
        assert mock_email_address.last_check_at is not None
        # Check last_error was cleared
        assert mock_email_address.last_error == ''
        # Check processed count was incremented
        assert mock_email_address.emails_processed_count == 7
        mock_email_address.save.assert_called_once()

    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.disconnect')
    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.connect')
    def test_updates_error_on_exception(self, mock_connect, mock_disconnect):
        """Should update last_error when exception occurs."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_imap_configured = True
        mock_email_address.is_smtp_configured = True
        mock_email_address.is_active = True

        receiver = TicketEmailReceiver(mock_email_address)

        # Mock connection that raises
        mock_conn = Mock()
        mock_conn.select.side_effect = Exception("Server error")
        receiver.connection = mock_conn

        result = receiver.fetch_and_process_emails()

        # Should have updated error
        assert mock_email_address.last_error != ''
        assert 'Server error' in mock_email_address.last_error
        mock_email_address.save.assert_called()
        assert result == 0

    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.disconnect')
    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver.connect')
    def test_continues_processing_after_single_email_error(self, mock_connect, mock_disconnect):
        """Should continue processing other emails if one fails."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_imap_configured = True
        mock_email_address.is_smtp_configured = True
        mock_email_address.is_active = True
        mock_email_address.imap_folder = 'INBOX'
        mock_email_address.emails_processed_count = 0

        receiver = TicketEmailReceiver(mock_email_address)

        mock_conn = Mock()
        mock_conn.select.return_value = ('OK', None)
        mock_conn.search.return_value = ('OK', [b'1 2 3 4'])
        receiver.connection = mock_conn

        # 1st succeeds, 2nd raises, 3rd succeeds, 4th returns False
        with patch.object(receiver, '_process_single_email', side_effect=[True, Exception("Error"), True, False]):
            result = receiver.fetch_and_process_emails()

        # Should have processed 2 successfully (1st and 3rd)
        assert result == 2
        assert mock_email_address.emails_processed_count == 2


class TestProcessSingleEmailEdgeCases:
    """Tests for edge cases in _process_single_email."""

    def test_handles_fetch_status_not_ok(self):
        """Should return False when fetch status is not OK."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        mock_conn = Mock()
        mock_conn.fetch.return_value = ('NO', [])
        receiver.connection = mock_conn

        result = receiver._process_single_email(b'1')

        assert result is False

    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver._extract_email_data')
    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver._delete_email')
    @patch('smoothschedule.commerce.tickets.models.IncomingTicketEmail.objects.filter')
    def test_deletes_email_to_noreply(self, mock_filter, mock_delete, mock_extract):
        """Should delete emails sent to noreply@smoothschedule.com."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        mock_filter.return_value.exists.return_value = False

        mock_extract.return_value = {
            'to_address': 'noreply@smoothschedule.com',
            'from_address': 'user@example.com',
            'message_id': '<test@example.com>',
        }

        msg = EmailMessage()
        msg.set_content('test')

        mock_conn = Mock()
        mock_conn.fetch.return_value = ('OK', [(None, msg.as_bytes())])
        receiver.connection = mock_conn

        result = receiver._process_single_email(b'1')

        mock_delete.assert_called_once_with(b'1')
        assert result is False

    @patch('smoothschedule.commerce.tickets.email_receiver.TicketEmailReceiver._extract_email_data')
    @patch('smoothschedule.commerce.tickets.models.IncomingTicketEmail.objects.filter')
    def test_returns_false_for_duplicate_email(self, mock_filter, mock_extract):
        """Should return False when message ID already exists."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        # Mock duplicate exists
        mock_filter.return_value.exists.return_value = True

        mock_extract.return_value = {
            'to_address': 'support@example.com',
            'from_address': 'user@example.com',
            'message_id': '<duplicate@example.com>',
        }

        msg = EmailMessage()
        msg.set_content('test')

        mock_conn = Mock()
        mock_conn.fetch.return_value = ('OK', [(None, msg.as_bytes())])
        receiver.connection = mock_conn

        result = receiver._process_single_email(b'1')

        assert result is False


class TestCreateNewTicketFromEmailError:
    """Tests for _create_new_ticket_from_email error handling."""

    def test_handles_ticket_creation_failure(self):
        """Should return False and mark incoming email as failed."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        mock_incoming = Mock()

        email_data = {
            'subject': 'Test',
            'body_text': 'Body',
            'body_html': '',
            'extracted_reply': '',
            'from_address': 'user@example.com',
            'from_name': 'User',
        }

        with patch('smoothschedule.commerce.tickets.models.Ticket.objects.create', side_effect=Exception("DB error")):
            result = receiver._create_new_ticket_from_email(email_data, mock_incoming, None)

        assert result is False
        mock_incoming.mark_failed.assert_called_once()


class TestPlatformEmailReceiverCreateTicketError:
    """Tests for PlatformEmailReceiver ticket creation error handling."""

    def test_handles_ticket_creation_failure(self):
        """Should return False and mark incoming email as failed."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        mock_incoming = Mock()

        email_data = {
            'subject': 'Test',
            'body_text': 'Body',
            'body_html': '',
            'extracted_reply': '',
            'from_address': 'user@example.com',
            'from_name': 'User',
        }

        with patch('smoothschedule.commerce.tickets.models.Ticket.objects.create', side_effect=Exception("DB error")):
            result = receiver._create_new_ticket_from_email(email_data, mock_incoming, None)

        assert result is False
        mock_incoming.mark_failed.assert_called_once()


class TestExtractEmailDataFields:
    """Tests for various fields extracted in _extract_email_data."""

    def test_generates_message_id_when_missing(self):
        """Should generate message ID when not in email."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        msg = EmailMessage()
        msg['From'] = 'test@example.com'
        msg['To'] = 'support@example.com'
        msg['Date'] = 'Mon, 1 Jan 2024 12:00:00 +0000'
        msg.set_content('test')
        # No Message-ID

        result = receiver._extract_email_data(msg)

        assert result['message_id'].startswith('generated-')

    def test_handles_invalid_date_string(self):
        """Should use current time when date string is invalid."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        msg = EmailMessage()
        msg['From'] = 'test@example.com'
        msg['To'] = 'support@example.com'
        msg['Message-ID'] = '<test@example.com>'
        msg['Date'] = 'Not A Valid Date'
        msg.set_content('test')

        result = receiver._extract_email_data(msg)

        # Should have a datetime (current time)
        assert result['date'] is not None
        from datetime import datetime
        assert isinstance(result['date'], datetime)

    def test_extracts_all_relevant_headers(self):
        """Should extract all relevant headers into headers dict."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        msg = EmailMessage()
        msg['From'] = 'test@example.com'
        msg['To'] = 'support@example.com'
        msg['Subject'] = 'Test'
        msg['Message-ID'] = '<test@example.com>'
        msg['Date'] = 'Mon, 1 Jan 2024 12:00:00 +0000'
        msg['In-Reply-To'] = '<original@example.com>'
        msg['References'] = '<ref1@example.com> <ref2@example.com>'
        msg['X-Ticket-ID'] = '123'
        msg.set_content('test')

        result = receiver._extract_email_data(msg)

        headers = result['headers']
        assert headers['from'] == 'test@example.com'
        assert headers['to'] == 'support@example.com'
        assert headers['subject'] == 'Test'
        assert headers['in-reply-to'] == '<original@example.com>'
        assert headers['references'] == '<ref1@example.com> <ref2@example.com>'
        assert headers['x-ticket-id'] == '123'


class TestPlatformReceiverFetchFlow:
    """Tests for PlatformEmailReceiver fetch flow."""

    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.disconnect')
    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.connect')
    def test_updates_check_time_on_success(self, mock_connect, mock_disconnect):
        """Should update last_check_at and processed count."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_active = True
        mock_email_address.display_name = 'Test'
        mock_email_address.emails_processed_count = 10

        receiver = PlatformEmailReceiver(mock_email_address)

        mock_conn = Mock()
        mock_conn.select.return_value = ('OK', None)
        mock_conn.search.return_value = ('OK', [b'1'])
        receiver.connection = mock_conn

        with patch.object(receiver, '_process_single_email', return_value=True):
            result = receiver.fetch_and_process_emails()

        assert result == 1
        assert mock_email_address.last_check_at is not None
        assert mock_email_address.emails_processed_count == 11
        mock_email_address.save.assert_called_once()

    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.disconnect')
    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.connect')
    def test_updates_error_on_exception(self, mock_connect, mock_disconnect):
        """Should update last_sync_error when exception occurs."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_active = True
        mock_email_address.display_name = 'Test'

        receiver = PlatformEmailReceiver(mock_email_address)

        mock_conn = Mock()
        mock_conn.select.side_effect = Exception("Error")
        receiver.connection = mock_conn

        result = receiver.fetch_and_process_emails()

        assert result == 0
        assert mock_email_address.last_sync_error != ''
        mock_disconnect.assert_called_once()
@@ -0,0 +1,549 @@
|
|||||||
|
"""
|
||||||
|
Unit tests for email_receiver.py focusing on uncovered lines.
|
||||||
|
|
||||||
|
Uses mocks extensively to avoid database access.
|
||||||
|
"""
|
||||||
|
from unittest.mock import Mock, patch, MagicMock, call
|
||||||
|
import pytest
|
||||||
|
import email
|
||||||
|
from email.message import EmailMessage
|
||||||
|
from datetime import datetime
|
||||||
|
import imaplib
|
||||||
|
|
||||||
|
|
||||||
|
class TestExtractEmailDataWithBody:
|
||||||
|
"""Tests for _extract_email_data body extraction logic."""
|
||||||
|
|
||||||
|
def test_extracts_reply_from_body_text(self):
|
||||||
|
"""Should call _extract_reply_text on body_text."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
msg = EmailMessage()
|
||||||
|
msg['From'] = 'john@example.com'
|
||||||
|
msg['To'] = 'support@example.com'
|
||||||
|
msg['Subject'] = 'Test'
|
||||||
|
msg['Message-ID'] = '<test@example.com>'
|
||||||
|
msg['Date'] = 'Mon, 1 Jan 2024 12:00:00 +0000'
|
||||||
|
msg.set_content('Reply text\n\nOn Jan 1 wrote:\n> quoted')
|
||||||
|
|
||||||
|
with patch.object(receiver, '_extract_reply_text', return_value='Reply text') as mock_extract:
|
||||||
|
result = receiver._extract_email_data(msg)
|
||||||
|
|
||||||
|
mock_extract.assert_called_once()
|
||||||
|
assert result['extracted_reply'] == 'Reply text'
|
||||||
|
|
||||||
|
|
||||||
|
class TestExtractReplyTextPatterns:
|
||||||
|
"""Tests for quote pattern matching in _extract_reply_text."""
|
||||||
|
|
||||||
|
def test_strips_outlook_formatted_from(self):
|
||||||
|
"""Should remove *From:* formatted Outlook quotes."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
text = "My reply\n\n*From:* Someone\nQuoted text"
|
||||||
|
result = receiver._extract_reply_text(text)
|
||||||
|
|
||||||
|
assert 'My reply' in result
|
||||||
|
assert '*From:*' not in result
|
||||||
|
|
||||||
|
def test_strips_sent_from_my(self):
|
||||||
|
"""Should remove 'Sent from my' mobile signatures."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
text = "Reply here\nSent from my iPhone"
|
||||||
|
result = receiver._extract_reply_text(text)
|
||||||
|
|
||||||
|
assert 'Reply here' in result
|
||||||
|
assert 'Sent from my' not in result
|
||||||
|
|
||||||
|
def test_strips_get_outlook_for(self):
|
||||||
|
"""Should remove 'Get Outlook for' signatures."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
text = "My message\nGet Outlook for iOS"
|
||||||
|
result = receiver._extract_reply_text(text)
|
||||||
|
|
||||||
|
assert 'My message' in result
|
||||||
|
assert 'Get Outlook' not in result
|
||||||
|
|
||||||
|
def test_strips_underscore_separator(self):
|
||||||
|
"""Should stop at underscore separator lines."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
text = "Reply\n___________\nOriginal message below"
|
||||||
|
result = receiver._extract_reply_text(text)
|
||||||
|
|
||||||
|
assert 'Reply' in result
|
||||||
|
assert 'Original message' not in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestFindMatchingTicketEdgeCases:
|
||||||
|
"""Tests for edge cases in _find_matching_ticket."""
|
||||||
|
|
||||||
|
@patch('smoothschedule.commerce.tickets.models.Ticket.objects.get')
|
||||||
|
def test_handles_value_error_in_ticket_id(self, mock_get):
|
||||||
|
"""Should handle ValueError when ticket ID is not a valid integer."""
|
||||||
|
from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
|
||||||
|
|
||||||
|
mock_email_address = Mock()
|
||||||
|
receiver = TicketEmailReceiver(mock_email_address)
|
||||||
|
|
||||||
|
mock_get.side_effect = ValueError("invalid literal")
|
||||||
|
|
||||||
|
email_data = {
|
||||||
|
'ticket_id': 'abc', # Not a number
|
            'headers': {},
            'from_address': 'user@example.com',
        }

        with patch('smoothschedule.identity.users.models.User.objects.filter') as mock_user:
            mock_user.return_value.first.return_value = None
            result = receiver._find_matching_ticket(email_data)

        assert result is None

    @patch('smoothschedule.identity.users.models.User.objects.filter')
    def test_handles_exception_finding_user_ticket(self, mock_user_filter):
        """Should handle exceptions when looking up user's recent tickets."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        mock_user_filter.side_effect = Exception("Database error")

        email_data = {
            'ticket_id': '',
            'headers': {},
            'from_address': 'user@example.com',
        }

        result = receiver._find_matching_ticket(email_data)

        assert result is None


class TestHtmlToTextEdgeCases:
    """Tests for HTML to text conversion edge cases."""

    def test_converts_multiple_paragraphs(self):
        """Should handle multiple paragraph tags."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        html = '<p>First</p><p>Second</p><p>Third</p>'
        result = receiver._html_to_text(html)

        assert 'First' in result
        assert 'Second' in result
        assert 'Third' in result
        assert '<p>' not in result

    def test_handles_mixed_case_tags(self):
        """Should handle mixed case HTML tags."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        html = '<SCRIPT>bad</SCRIPT><P>Good</P><BR/><style>css</style>'
        result = receiver._html_to_text(html)

        assert 'bad' not in result
        assert 'css' not in result
        assert 'Good' in result
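The cases above pin down the observable behavior of `_html_to_text` (script/style blocks removed with their contents, paragraph and break tags turned into line breaks, remaining tags stripped) without showing its body. A minimal regex-based sketch that satisfies those behaviors — a hypothetical reimplementation, not the project's actual method — could look like:

```python
import re
from html import unescape

def html_to_text(html: str) -> str:
    """Tiny HTML-to-text sketch matching the behaviors the tests exercise."""
    # Drop <script>/<style> blocks including their contents, any tag case.
    text = re.sub(r'<(script|style)[^>]*>.*?</\1>', '', html,
                  flags=re.IGNORECASE | re.DOTALL)
    # Turn paragraph ends and <br> into newlines so text stays readable.
    text = re.sub(r'</p>', '\n', text, flags=re.IGNORECASE)
    text = re.sub(r'<br\s*/?>', '\n', text, flags=re.IGNORECASE)
    # Strip all remaining tags, then decode entities.
    text = re.sub(r'<[^>]+>', '', text)
    return unescape(text).strip()
```

The `re.IGNORECASE` flag also applies to the `\1` backreference, which is what makes the mixed-case `<SCRIPT>…</SCRIPT>` case work.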
class TestDecodeHeaderWithCharsets:
    """Tests for header decoding with different charsets."""

    def test_handles_multiple_decoded_parts(self):
        """Should join multiple decoded header parts."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        # Test with a normal string (non-encoded)
        result = receiver._decode_header('Part 1 Part 2 Part 3')

        assert 'Part 1' in result
        assert 'Part 2' in result
        assert 'Part 3' in result

    def test_handles_decode_exception_with_fallback(self):
        """Should use errors='replace' when decoding fails."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        # Test with empty string - should return empty
        result = receiver._decode_header('')

        assert result == ''


class TestExtractBodyWithErrors:
    """Tests for body extraction error handling."""

    def test_handles_charset_decode_error_in_multipart(self):
        """Should handle charset decode errors in multipart emails."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
        from email.mime.multipart import MIMEMultipart
        from email.mime.text import MIMEText

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        msg = MIMEMultipart()
        part = MIMEText('Test', 'plain', 'utf-8')

        # Mock get_payload to return bytes that can't decode
        with patch.object(part, 'get_payload', return_value=b'\xff\xfe'):
            with patch.object(part, 'get_content_charset', return_value='utf-8'):
                msg.attach(part)
                text_body, html_body = receiver._extract_body(msg)

        # Should not crash, errors='replace' should handle it
        assert isinstance(text_body, str)

    def test_handles_none_body_in_multipart(self):
        """Should handle None body from get_payload."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
        from email.mime.multipart import MIMEMultipart
        from email.mime.base import MIMEBase

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        msg = MIMEMultipart()
        part = MIMEBase('text', 'plain')

        with patch.object(part, 'get_payload', return_value=None):
            msg.attach(part)
            text_body, html_body = receiver._extract_body(msg)

        assert text_body == ''
        assert html_body == ''


class TestPlatformEmailReceiverMethods:
    """Tests for PlatformEmailReceiver specific methods."""

    def test_extract_reply_text_with_empty_returns_empty(self):
        """Should return empty string for None input."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        result = receiver._extract_reply_text(None)

        assert result == ''

    def test_extract_reply_text_strips_quotes(self):
        """Should strip quoted content from reply."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        text = "My reply\n> quoted line"
        result = receiver._extract_reply_text(text)

        assert 'My reply' in result
        assert 'quoted line' not in result

    def test_html_to_text_converts_entities(self):
        """Should convert HTML entities."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        html = '<div>&lt;div&gt;&amp;&quot;test&quot;</div>'
        result = receiver._html_to_text(html)

        assert '<div>' in result
        assert '&' in result
        assert '"test"' in result

    def test_extract_body_multipart_without_attachment(self):
        """Should extract multipart email body correctly."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver
        from email.mime.multipart import MIMEMultipart
        from email.mime.text import MIMEText

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        msg = MIMEMultipart()
        msg.attach(MIMEText('Plain text', 'plain'))
        msg.attach(MIMEText('<p>HTML</p>', 'html'))

        text_body, html_body = receiver._extract_body(msg)

        assert 'Plain text' in text_body
        assert '<p>HTML</p>' in html_body

    def test_extract_body_single_part_html(self):
        """Should extract single-part HTML email."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        msg = EmailMessage()
        msg.set_content('<p>HTML content</p>', subtype='html')

        text_body, html_body = receiver._extract_body(msg)

        assert html_body != ''
        assert '<p>HTML content</p>' in html_body

    def test_extract_body_converts_html_when_no_text(self):
        """Should convert HTML to text when no text part available."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver
        from email.mime.multipart import MIMEMultipart
        from email.mime.text import MIMEText

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        msg = MIMEMultipart()
        msg.attach(MIMEText('<p>HTML only</p>', 'html'))

        text_body, html_body = receiver._extract_body(msg)

        # Should have converted HTML to text
        assert 'HTML only' in text_body

    @patch('smoothschedule.commerce.tickets.models.Ticket.objects.get')
    def test_find_matching_ticket_returns_none(self, mock_get):
        """Should return None when no ticket found."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver
        from smoothschedule.commerce.tickets.models import Ticket

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        mock_get.side_effect = Ticket.DoesNotExist()

        email_data = {
            'ticket_id': '999',
            'headers': {},
        }

        result = receiver._find_matching_ticket(email_data)

        assert result is None

    @patch('smoothschedule.commerce.tickets.models.Ticket.objects.get')
    def test_find_matching_ticket_by_references_header(self, mock_get):
        """Should try to find ticket ID in References header."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver
        from smoothschedule.commerce.tickets.models import Ticket

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        # Mock DoesNotExist for all attempts
        mock_get.side_effect = Ticket.DoesNotExist()

        email_data = {
            'ticket_id': '',
            'headers': {
                'x-ticket-id': '',
                'in-reply-to': '',
                'references': '<ticket-555@example.com>',
            },
        }

        result = receiver._find_matching_ticket(email_data)

        # PlatformEmailReceiver doesn't look up by user, so returns None
        assert result is None

    @patch('smoothschedule.commerce.tickets.models.Ticket.objects.get')
    def test_find_matching_ticket_handles_value_error(self, mock_get):
        """Should handle ValueError in references parsing."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_email_address = Mock()
        receiver = PlatformEmailReceiver(mock_email_address)

        mock_get.side_effect = ValueError("Invalid")

        email_data = {
            'ticket_id': '',
            'headers': {
                'in-reply-to': 'ticket-abc',
            },
        }

        result = receiver._find_matching_ticket(email_data)

        assert result is None


class TestConnectErrorHandling:
    """Tests for connection error handling in both receivers."""

    @patch('imaplib.IMAP4_SSL')
    def test_updates_email_address_on_imap_error(self, mock_imap):
        """Should update email address last_error on IMAP error."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver
        import imaplib

        mock_imap.side_effect = imaplib.IMAP4.error("Login failed")

        mock_email_address = Mock()
        mock_email_address.is_imap_configured = True
        mock_email_address.imap_use_ssl = True
        mock_email_address.imap_host = 'imap.example.com'
        mock_email_address.imap_port = 993
        mock_email_address.display_name = 'Test'

        receiver = TicketEmailReceiver(mock_email_address)
        result = receiver.connect()

        assert result is False
        assert 'IMAP login failed' in mock_email_address.last_error
        mock_email_address.save.assert_called()

    @patch('imaplib.IMAP4_SSL')
    def test_updates_email_address_on_connection_error(self, mock_imap):
        """Should update email address last_error on general connection error."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_imap.side_effect = Exception("Network error")

        mock_email_address = Mock()
        mock_email_address.is_imap_configured = True
        mock_email_address.imap_use_ssl = True
        mock_email_address.imap_host = 'imap.example.com'
        mock_email_address.imap_port = 993
        mock_email_address.display_name = 'Test'

        receiver = TicketEmailReceiver(mock_email_address)
        result = receiver.connect()

        assert result is False
        assert 'Connection failed' in mock_email_address.last_error
        mock_email_address.save.assert_called()

    @patch('imaplib.IMAP4_SSL')
    def test_platform_receiver_updates_error_on_imap_error(self, mock_imap):
        """Should update platform email address on IMAP error."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver
        import imaplib

        mock_imap.side_effect = imaplib.IMAP4.error("Login failed")

        mock_email_address = Mock()
        mock_email_address.get_imap_settings.return_value = {
            'host': 'imap.example.com',
            'port': 993,
            'username': 'user',
            'password': 'pass',
            'use_ssl': True,
        }
        mock_email_address.display_name = 'Test'

        receiver = PlatformEmailReceiver(mock_email_address)
        result = receiver.connect()

        assert result is False
        assert 'IMAP login failed' in mock_email_address.last_sync_error

    @patch('imaplib.IMAP4_SSL')
    def test_platform_receiver_updates_error_on_general_error(self, mock_imap):
        """Should update platform email address on general connection error."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_imap.side_effect = Exception("Connection refused")

        mock_email_address = Mock()
        mock_email_address.get_imap_settings.return_value = {
            'host': 'imap.example.com',
            'port': 993,
            'username': 'user',
            'password': 'pass',
            'use_ssl': True,
        }
        mock_email_address.display_name = 'Test'

        receiver = PlatformEmailReceiver(mock_email_address)
        result = receiver.connect()

        assert result is False
        assert 'Connection failed' in mock_email_address.last_sync_error


class TestFetchAndProcessEdgeCases:
    """Tests for edge cases in fetch_and_process_emails."""

    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.connect')
    @patch('smoothschedule.commerce.tickets.email_receiver.PlatformEmailReceiver.disconnect')
    def test_platform_receiver_handles_exception(self, mock_disconnect, mock_connect):
        """Should handle exception during email fetching."""
        from smoothschedule.commerce.tickets.email_receiver import PlatformEmailReceiver

        mock_connect.return_value = True

        mock_email_address = Mock()
        mock_email_address.is_active = True

        receiver = PlatformEmailReceiver(mock_email_address)

        mock_connection = Mock()
        mock_connection.select.side_effect = Exception("Server error")
        receiver.connection = mock_connection

        result = receiver.fetch_and_process_emails()

        assert result == 0
        mock_disconnect.assert_called_once()
        assert 'Server error' in mock_email_address.last_sync_error


class TestDeleteEmailMethod:
    """Tests for _delete_email method in TicketEmailReceiver."""

    def test_ticket_receiver_delete_email_marks_as_deleted(self):
        """TicketEmailReceiver._delete_email should mark email as deleted."""
        from smoothschedule.commerce.tickets.email_receiver import TicketEmailReceiver

        mock_email_address = Mock()
        receiver = TicketEmailReceiver(mock_email_address)

        # Method exists and can be called
        assert hasattr(receiver, '_delete_email')

        # Test it can be called without error when connection exists
        mock_connection = Mock()
        receiver.connection = mock_connection

        # Should not raise
        receiver._delete_email(b'123')

        mock_connection.store.assert_called_once_with(b'123', '+FLAGS', '\\Deleted')
        mock_connection.expunge.assert_called_once()
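The `_delete_email` test asserts the standard IMAP delete sequence: set the `\Deleted` flag with `STORE`, then `EXPUNGE` to remove flagged messages. A free-standing sketch of that sequence, driven with a mock connection the same way the test does (the `delete_email` name here is illustrative, not the project's):

```python
from unittest.mock import Mock

def delete_email(connection, email_id: bytes) -> None:
    # Flag the message as \Deleted, then expunge to remove it permanently.
    connection.store(email_id, '+FLAGS', '\\Deleted')
    connection.expunge()

# Exercised with a mock IMAP connection, as in the test above:
conn = Mock()
delete_email(conn, b'123')
conn.store.assert_called_once_with(b'123', '+FLAGS', '\\Deleted')
conn.expunge.assert_called_once()
```

Against a real `imaplib.IMAP4_SSL` connection the same two calls apply, provided a mailbox was previously selected with `select()`.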
@@ -0,0 +1,768 @@
"""
Unit tests for email_renderer module.

Tests email rendering pipeline with comprehensive coverage of all component types.
"""

import pytest
from unittest.mock import Mock

from smoothschedule.communication.messaging.email_renderer import (
    substitute_tags,
    render_subject,
    render_component_html,
    render_email_layout,
    render_email_header,
    render_email_heading,
    render_email_text,
    render_email_button,
    render_email_divider,
    render_email_spacer,
    render_email_image,
    render_email_panel,
    render_email_two_column,
    render_email_footer,
    render_email_branding,
    render_unknown_component,
    render_email_html,
    render_component_text,
    render_email_plaintext,
    render_email,
    render_custom_email,
)


class TestSubstituteTags:
    """Tests for substitute_tags function."""

    def test_returns_empty_string_when_text_is_empty(self):
        """Should return empty string when text is empty."""
        result = substitute_tags('', {'name': 'Test'})
        assert result == ''

    def test_returns_empty_string_when_text_is_none(self):
        """Should return empty string when text is None."""
        result = substitute_tags(None, {'name': 'Test'})
        assert result == ''

    def test_substitutes_single_tag(self):
        """Should substitute a single tag."""
        result = substitute_tags('Hello {{ name }}', {'name': 'World'})
        assert result == 'Hello World'

    def test_substitutes_multiple_tags(self):
        """Should substitute multiple tags."""
        result = substitute_tags('{{ greeting }} {{ name }}!', {'greeting': 'Hello', 'name': 'World'})
        assert result == 'Hello World!'

    def test_escapes_html_by_default(self):
        """Should escape HTML in substituted values by default."""
        result = substitute_tags('{{ content }}', {'content': '<script>alert("xss")</script>'})
        assert '&lt;script&gt;' in result
        assert '&lt;/script&gt;' in result

    def test_does_not_escape_html_when_disabled(self):
        """Should not escape HTML when escape_html=False."""
        result = substitute_tags('{{ content }}', {'content': '<b>Bold</b>'}, escape_html=False)
        assert result == '<b>Bold</b>'

    def test_keeps_original_tag_when_value_not_found(self):
        """Should keep original tag when value not in context."""
        result = substitute_tags('Hello {{ missing }}', {})
        assert result == 'Hello {{ missing }}'

    def test_converts_none_value_to_empty_string(self):
        """Should convert None values to empty string."""
        result = substitute_tags('Hello {{ name }}', {'name': None})
        assert result == 'Hello '

    def test_converts_non_string_values_to_string(self):
        """Should convert non-string values to string."""
        result = substitute_tags('Count: {{ count }}', {'count': 42})
        assert result == 'Count: 42'
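Taken together, the `TestSubstituteTags` cases fully specify the contract: empty or `None` input yields `''`, known `{{ tag }}` placeholders are substituted (HTML-escaped unless `escape_html=False`), unknown placeholders are left verbatim, and `None` values become empty strings. A minimal sketch meeting that contract — a hypothetical reimplementation, not the module's actual code:

```python
import re
from html import escape

def substitute_tags(text, context, escape_html=True):
    """Substitute {{ tag }} placeholders from context into text."""
    if not text:
        return ''

    def repl(match):
        key = match.group(1)
        if key not in context:
            return match.group(0)  # keep unknown tags verbatim
        value = context[key]
        value = '' if value is None else str(value)
        return escape(value) if escape_html else value

    return re.sub(r'\{\{\s*(\w+)\s*\}\}', repl, text)
```

Escaping only the substituted values (not the surrounding template) is what lets the template itself contain markup while still neutralizing injected content.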
|
||||||
|
class TestRenderSubject:
|
||||||
|
"""Tests for render_subject function."""
|
||||||
|
|
||||||
|
def test_renders_subject_with_tags(self):
|
||||||
|
"""Should render subject with tag substitution."""
|
||||||
|
result = render_subject('Appointment with {{ customer_name }}', {'customer_name': 'John Doe'})
|
||||||
|
assert result == 'Appointment with John Doe'
|
||||||
|
|
||||||
|
def test_escapes_html_in_subject(self):
|
||||||
|
"""Should escape HTML in subject."""
|
||||||
|
result = render_subject('{{ title }}', {'title': '<script>alert()</script>'})
|
||||||
|
assert '<script>' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailLayout:
|
||||||
|
"""Tests for render_email_layout function."""
|
||||||
|
|
||||||
|
def test_renders_layout_wrapper_with_default_colors(self):
|
||||||
|
"""Should render layout wrapper with default colors."""
|
||||||
|
result = render_email_layout({}, {})
|
||||||
|
assert 'background-color: #f4f4f5' in result
|
||||||
|
assert 'background-color: #ffffff' in result
|
||||||
|
assert '<table role="presentation"' in result
|
||||||
|
|
||||||
|
def test_renders_layout_with_custom_colors(self):
|
||||||
|
"""Should render layout with custom background colors."""
|
||||||
|
props = {
|
||||||
|
'backgroundColor': '#cccccc',
|
||||||
|
'contentBackgroundColor': '#eeeeee'
|
||||||
|
}
|
||||||
|
result = render_email_layout(props, {})
|
||||||
|
assert 'background-color: #cccccc' in result
|
||||||
|
assert 'background-color: #eeeeee' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailHeader:
|
||||||
|
"""Tests for render_email_header function."""
|
||||||
|
|
||||||
|
def test_renders_header_without_logo(self):
|
||||||
|
"""Should render header without logo when not provided."""
|
||||||
|
props = {'businessName': 'Test Business'}
|
||||||
|
result = render_email_header(props, {})
|
||||||
|
assert 'Test Business' in result
|
||||||
|
assert '<img' not in result
|
||||||
|
|
||||||
|
def test_renders_header_with_logo(self):
|
||||||
|
"""Should render header with logo when provided."""
|
||||||
|
props = {
|
||||||
|
'logoUrl': 'https://example.com/logo.png',
|
||||||
|
'businessName': 'Test Business'
|
||||||
|
}
|
||||||
|
result = render_email_header(props, {})
|
||||||
|
assert '<img src="https://example.com/logo.png"' in result
|
||||||
|
assert 'Test Business' in result
|
||||||
|
|
||||||
|
def test_renders_header_with_preheader_text(self):
|
||||||
|
"""Should render hidden preheader text."""
|
||||||
|
props = {
|
||||||
|
'businessName': 'Test',
|
||||||
|
'preheader': 'Preview text here'
|
||||||
|
}
|
||||||
|
result = render_email_header(props, {})
|
||||||
|
assert 'Preview text here' in result
|
||||||
|
assert 'display: none' in result
|
||||||
|
assert 'max-height: 0' in result
|
||||||
|
|
||||||
|
def test_uses_business_name_from_context_when_not_in_props(self):
|
||||||
|
"""Should fall back to context for business name."""
|
||||||
|
props = {}
|
||||||
|
context = {'business_name': 'Context Business'}
|
||||||
|
result = render_email_header(props, context)
|
||||||
|
assert 'Context Business' in result
|
||||||
|
|
||||||
|
def test_renders_without_business_name(self):
|
||||||
|
"""Should render without business name if neither props nor context has it."""
|
||||||
|
result = render_email_header({}, {})
|
||||||
|
assert result # Should not crash
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailHeading:
|
||||||
|
"""Tests for render_email_heading function."""
|
||||||
|
|
||||||
|
def test_renders_h1_heading(self):
|
||||||
|
"""Should render h1 heading with correct styles."""
|
||||||
|
props = {'text': 'Main Title', 'level': 'h1'}
|
||||||
|
result = render_email_heading(props, {})
|
||||||
|
assert '<h1' in result
|
||||||
|
assert 'Main Title' in result
|
||||||
|
assert 'font-size: 28px' in result
|
||||||
|
|
||||||
|
def test_renders_h2_heading_by_default(self):
|
||||||
|
"""Should render h2 heading by default."""
|
||||||
|
props = {'text': 'Subtitle'}
|
||||||
|
result = render_email_heading(props, {})
|
||||||
|
assert '<h2' in result
|
||||||
|
assert 'font-size: 22px' in result
|
||||||
|
|
||||||
|
def test_renders_h3_heading(self):
|
||||||
|
"""Should render h3 heading."""
|
||||||
|
props = {'text': 'Section Title', 'level': 'h3'}
|
||||||
|
result = render_email_heading(props, {})
|
||||||
|
assert '<h3' in result
|
||||||
|
assert 'font-size: 18px' in result
|
||||||
|
|
||||||
|
def test_applies_text_alignment(self):
|
||||||
|
"""Should apply text alignment."""
|
||||||
|
props = {'text': 'Centered', 'align': 'center'}
|
||||||
|
result = render_email_heading(props, {})
|
||||||
|
assert 'text-align: center' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailText:
|
||||||
|
"""Tests for render_email_text function."""
|
||||||
|
|
||||||
|
def test_renders_text_paragraph(self):
|
||||||
|
"""Should render text as paragraph."""
|
||||||
|
props = {'content': 'This is a test paragraph.'}
|
||||||
|
result = render_email_text(props, {})
|
||||||
|
assert '<p' in result
|
||||||
|
assert 'This is a test paragraph.' in result
|
||||||
|
|
||||||
|
def test_converts_newlines_to_br_tags(self):
|
||||||
|
"""Should convert newlines to <br> tags."""
|
||||||
|
props = {'content': 'Line 1\nLine 2\nLine 3'}
|
||||||
|
result = render_email_text(props, {})
|
||||||
|
assert 'Line 1<br>Line 2<br>Line 3' in result
|
||||||
|
|
||||||
|
def test_applies_text_alignment(self):
|
||||||
|
"""Should apply text alignment."""
|
||||||
|
props = {'content': 'Right aligned', 'align': 'right'}
|
||||||
|
result = render_email_text(props, {})
|
||||||
|
assert 'text-align: right' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailButton:
|
||||||
|
"""Tests for render_email_button function."""
|
||||||
|
|
||||||
|
def test_renders_primary_button(self):
|
||||||
|
"""Should render primary button with correct styles."""
|
||||||
|
props = {'text': 'Click Me', 'href': 'https://example.com'}
|
||||||
|
result = render_email_button(props, {})
|
||||||
|
assert '<a href="https://example.com"' in result
|
||||||
|
assert 'Click Me' in result
|
||||||
|
assert 'background-color: #4f46e5' in result
|
||||||
|
|
||||||
|
def test_renders_secondary_button(self):
|
||||||
|
"""Should render secondary button with different styles."""
|
||||||
|
props = {'text': 'Cancel', 'href': '/cancel', 'variant': 'secondary'}
|
||||||
|
result = render_email_button(props, {})
|
||||||
|
assert '<a href="/cancel"' in result
|
||||||
|
assert 'Cancel' in result
|
||||||
|
assert 'border: 2px solid #4f46e5' in result
|
||||||
|
|
||||||
|
def test_applies_button_alignment(self):
|
||||||
|
"""Should apply button alignment."""
|
||||||
|
props = {'text': 'Button', 'href': '#', 'align': 'left'}
|
||||||
|
result = render_email_button(props, {})
|
||||||
|
assert 'align="left"' in result
|
||||||
|
|
||||||
|
def test_uses_default_text_and_href(self):
|
||||||
|
"""Should use default text and href."""
|
||||||
|
result = render_email_button({}, {})
|
||||||
|
assert 'Click Here' in result
|
||||||
|
assert 'href="#"' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailDivider:
|
||||||
|
"""Tests for render_email_divider function."""
|
||||||
|
|
||||||
|
def test_renders_horizontal_divider(self):
|
||||||
|
"""Should render horizontal divider."""
|
||||||
|
result = render_email_divider({}, {})
|
||||||
|
assert '<hr' in result
|
||||||
|
assert 'border-top: 1px solid #e5e7eb' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailSpacer:
|
||||||
|
"""Tests for render_email_spacer function."""
|
||||||
|
|
||||||
|
def test_renders_small_spacer(self):
|
||||||
|
"""Should render small spacer."""
|
||||||
|
props = {'size': 'sm'}
|
||||||
|
result = render_email_spacer(props, {})
|
||||||
|
assert '<div' in result
|
||||||
|
assert 'height: 16px' in result
|
||||||
|
|
||||||
|
def test_renders_medium_spacer_by_default(self):
|
||||||
|
"""Should render medium spacer by default."""
|
||||||
|
result = render_email_spacer({}, {})
|
||||||
|
assert 'height: 32px' in result
|
||||||
|
|
||||||
|
def test_renders_large_spacer(self):
|
||||||
|
"""Should render large spacer."""
|
||||||
|
props = {'size': 'lg'}
|
||||||
|
result = render_email_spacer(props, {})
|
||||||
|
assert 'height: 48px' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailImage:
|
||||||
|
"""Tests for render_email_image function."""
|
||||||
|
|
||||||
|
def test_returns_empty_string_when_no_src(self):
|
||||||
|
"""Should return empty string when src is missing."""
|
||||||
|
result = render_email_image({}, {})
|
||||||
|
assert result == ''
|
||||||
|
|
||||||
|
def test_renders_image_with_src(self):
|
||||||
|
"""Should render image with src."""
|
||||||
|
props = {'src': 'https://example.com/image.jpg'}
|
||||||
|
result = render_email_image(props, {})
|
||||||
|
assert '<img src="https://example.com/image.jpg"' in result
|
||||||
|
|
||||||
|
def test_renders_image_with_alt_text(self):
|
||||||
|
"""Should render image with alt text."""
|
||||||
|
props = {'src': 'https://example.com/image.jpg', 'alt': 'Test Image'}
|
||||||
|
result = render_email_image(props, {})
|
||||||
|
assert 'alt="Test Image"' in result
|
||||||
|
|
||||||
|
def test_applies_max_width(self):
|
||||||
|
"""Should apply max width to image."""
|
||||||
|
props = {'src': 'https://example.com/image.jpg', 'maxWidth': '300px'}
|
||||||
|
result = render_email_image(props, {})
|
||||||
|
assert 'max-width: 300px' in result
|
||||||
|
|
||||||
|
def test_applies_alignment(self):
|
||||||
|
"""Should apply image alignment."""
|
||||||
|
props = {'src': 'https://example.com/image.jpg', 'align': 'right'}
|
||||||
|
result = render_email_image(props, {})
|
||||||
|
assert 'align="right"' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailPanel:
|
||||||
|
"""Tests for render_email_panel function."""
|
||||||
|
|
||||||
|
def test_renders_panel_with_content(self):
|
||||||
|
"""Should render panel with content."""
|
||||||
|
props = {'content': 'This is panel content'}
|
||||||
|
result = render_email_panel(props, {})
|
||||||
|
assert '<div' in result
|
||||||
|
assert 'This is panel content' in result
|
||||||
|
assert 'background-color: #f3f4f6' in result
|
||||||
|
|
||||||
|
def test_renders_panel_with_custom_background_color(self):
|
||||||
|
"""Should render panel with custom background color."""
|
||||||
|
props = {'content': 'Content', 'backgroundColor': '#ffcc00'}
|
||||||
|
result = render_email_panel(props, {})
|
||||||
|
assert 'background-color: #ffcc00' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailTwoColumn:
|
||||||
|
"""Tests for render_email_two_column function."""
|
||||||
|
|
||||||
|
def test_renders_two_column_layout(self):
|
||||||
|
"""Should render two column layout."""
|
||||||
|
props = {
|
||||||
|
'leftContent': 'Left side',
|
||||||
|
'rightContent': 'Right side'
|
||||||
|
}
|
||||||
|
result = render_email_two_column(props, {})
|
||||||
|
assert '<table' in result
|
||||||
|
assert 'Left side' in result
|
||||||
|
assert 'Right side' in result
|
||||||
|
assert 'width: 50%' in result
|
||||||
|
|
||||||
|
def test_applies_custom_gap(self):
|
||||||
|
"""Should apply custom gap between columns."""
|
||||||
|
props = {
|
||||||
|
'leftContent': 'Left',
|
||||||
|
'rightContent': 'Right',
|
||||||
|
'gap': '30px'
|
||||||
|
}
|
||||||
|
result = render_email_two_column(props, {})
|
||||||
|
assert 'padding-right: 30px' in result
|
||||||
|
assert 'padding-left: 30px' in result
|
||||||
|
|
||||||
|
|
||||||
|
class TestRenderEmailFooter:
    """Tests for render_email_footer function."""

    def test_renders_footer_with_address(self):
        """Should render footer with address."""
        props = {'address': '123 Main St, City, State 12345'}
        result = render_email_footer(props, {})
        assert '123 Main St, City, State 12345' in result

    def test_renders_footer_with_phone(self):
        """Should render footer with phone."""
        props = {'phone': '555-1234'}
        result = render_email_footer(props, {})
        assert '555-1234' in result

    def test_renders_footer_with_email(self):
        """Should render footer with email."""
        props = {'email': 'contact@example.com'}
        result = render_email_footer(props, {})
        assert 'mailto:contact@example.com' in result
        assert 'contact@example.com' in result

    def test_renders_footer_with_website(self):
        """Should render footer with website."""
        props = {'website': 'https://example.com'}
        result = render_email_footer(props, {})
        assert 'href="https://example.com"' in result

    def test_renders_footer_with_all_contact_info(self):
        """Should render footer with all contact info separated by pipes."""
        props = {
            'address': '123 Main St',
            'phone': '555-1234',
            'email': 'info@example.com',
            'website': 'https://example.com'
        }
        result = render_email_footer(props, {})
        assert '123 Main St' in result
        assert '555-1234' in result
        assert 'info@example.com' in result
        assert 'https://example.com' in result
        assert ' | ' in result  # Pipe separator

    def test_uses_context_values_when_props_not_provided(self):
        """Should use context values when props not provided."""
        context = {
            'business_address': '456 Oak St',
            'business_phone': '555-5678',
            'business_email': 'help@example.com',
            'business_website_url': 'https://business.example.com'
        }
        result = render_email_footer({}, context)
        assert '456 Oak St' in result
        assert '555-5678' in result
        assert 'help@example.com' in result
        assert 'https://business.example.com' in result

    def test_renders_empty_footer_when_no_data(self):
        """Should render empty footer when no data provided."""
        result = render_email_footer({}, {})
        assert '<div' in result
        assert '</div>' in result


class TestRenderEmailBranding:
    """Tests for render_email_branding function."""

    def test_renders_branding_by_default(self):
        """Should render branding by default."""
        result = render_email_branding({}, {})
        assert 'Powered by SmoothSchedule' in result
        assert 'https://smoothschedule.com' in result

    def test_hides_branding_when_show_branding_false(self):
        """Should hide branding when showBranding is False."""
        props = {'showBranding': False}
        result = render_email_branding(props, {})
        assert result == ''

    def test_hides_branding_when_can_remove_branding_in_context(self):
        """Should hide branding when can_remove_branding is True in context."""
        context = {'can_remove_branding': True}
        result = render_email_branding({}, context)
        assert result == ''

    def test_context_overrides_props_for_branding(self):
        """Should allow context to override props for branding."""
        props = {'showBranding': True}
        context = {'can_remove_branding': True}
        result = render_email_branding(props, context)
        assert result == ''


class TestRenderUnknownComponent:
    """Tests for render_unknown_component function."""

    def test_returns_empty_string(self):
        """Should return empty string for unknown component types."""
        result = render_unknown_component({}, {})
        assert result == ''


class TestRenderComponentHtml:
    """Tests for render_component_html function."""

    def test_renders_each_component_type(self):
        """Should render each supported component type."""
        component_types = [
            'EmailLayout', 'EmailHeader', 'EmailHeading', 'EmailText',
            'EmailButton', 'EmailDivider', 'EmailSpacer', 'EmailImage',
            'EmailPanel', 'EmailTwoColumn', 'EmailFooter', 'EmailBranding'
        ]
        for comp_type in component_types:
            component = {'type': comp_type, 'props': {}}
            result = render_component_html(component, {})
            # Should not crash and should return something
            assert result is not None

    def test_substitutes_tags_in_props(self):
        """Should substitute tags in all string props."""
        component = {
            'type': 'EmailText',
            'props': {'content': 'Hello {{ name }}'}
        }
        context = {'name': 'World'}
        result = render_component_html(component, context)
        assert 'Hello World' in result

    def test_renders_unknown_component_type(self):
        """Should handle unknown component type gracefully."""
        component = {'type': 'UnknownType', 'props': {}}
        result = render_component_html(component, {})
        assert result == ''


class TestRenderEmailHtml:
    """Tests for render_email_html function."""

    def test_renders_full_html_document(self):
        """Should render complete HTML document."""
        puck_data = {'content': []}
        result = render_email_html(puck_data, {})
        assert '<!DOCTYPE html>' in result
        assert '<html' in result
        assert '<body' in result
        assert '</body>' in result
        assert '</html>' in result

    def test_wraps_content_in_layout_when_no_layout_provided(self):
        """Should auto-wrap content in layout when no EmailLayout component."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test content'}}
            ]
        }
        result = render_email_html(puck_data, {})
        assert '<table role="presentation"' in result
        assert 'Test content' in result

    def test_detects_email_layout_component(self):
        """Should detect and use EmailLayout component when present."""
        puck_data = {
            'content': [
                {'type': 'EmailLayout', 'props': {}},
                {'type': 'EmailText', 'props': {'content': 'Test'}}
            ]
        }
        result = render_email_html(puck_data, {})
        assert '</td></tr></table></div>' in result  # Closing tags added

    def test_auto_adds_branding_for_free_plans(self):
        """Should auto-add branding when not present and no white-label permission."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}}
            ]
        }
        context = {'can_remove_branding': False}
        result = render_email_html(puck_data, context)
        assert 'Powered by SmoothSchedule' in result

    def test_does_not_add_branding_when_already_present(self):
        """Should not duplicate branding when already present."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}},
                {'type': 'EmailBranding', 'props': {}}
            ]
        }
        result = render_email_html(puck_data, {})
        # Count occurrences of "Powered by SmoothSchedule"
        count = result.count('Powered by SmoothSchedule')
        assert count == 1  # Should only appear once

    def test_does_not_add_branding_when_white_label_allowed(self):
        """Should not add branding when white-label permission granted."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}}
            ]
        }
        context = {'can_remove_branding': True}
        result = render_email_html(puck_data, context)
        assert 'Powered by SmoothSchedule' not in result


class TestRenderComponentText:
    """Tests for render_component_text (plaintext rendering)."""

    def test_renders_h1_heading_with_equals(self):
        """Should render h1 heading with equals signs."""
        component = {'type': 'EmailHeading', 'props': {'text': 'Title', 'level': 'h1'}}
        result = render_component_text(component, {})
        assert '=====\nTitle\n=====' in result

    def test_renders_h2_heading_with_dashes(self):
        """Should render h2 heading with dashes."""
        component = {'type': 'EmailHeading', 'props': {'text': 'Subtitle', 'level': 'h2'}}
        result = render_component_text(component, {})
        assert 'Subtitle\n--------' in result

    def test_renders_h3_heading_with_dashes(self):
        """Should render h3 heading with dashes (same as h2)."""
        component = {'type': 'EmailHeading', 'props': {'text': 'Section', 'level': 'h3'}}
        result = render_component_text(component, {})
        assert 'Section\n-------' in result

    def test_renders_text_component(self):
        """Should render text component as plain text."""
        component = {'type': 'EmailText', 'props': {'content': 'This is text'}}
        result = render_component_text(component, {})
        assert 'This is text' in result

    def test_renders_button_as_link(self):
        """Should render button as text with URL."""
        component = {'type': 'EmailButton', 'props': {'text': 'Click Here', 'href': 'https://example.com'}}
        result = render_component_text(component, {})
        assert '[Click Here]: https://example.com' in result

    def test_renders_divider_as_dashes(self):
        """Should render divider as dashes."""
        component = {'type': 'EmailDivider', 'props': {}}
        result = render_component_text(component, {})
        assert '----------------------------------------' in result

    def test_renders_spacer_as_newline(self):
        """Should render spacer as newline."""
        component = {'type': 'EmailSpacer', 'props': {}}
        result = render_component_text(component, {})
        assert result == '\n'

    def test_renders_image_with_alt_and_url(self):
        """Should render image as alt text with URL."""
        component = {'type': 'EmailImage', 'props': {'src': 'https://example.com/img.jpg', 'alt': 'Logo'}}
        result = render_component_text(component, {})
        assert '[Logo]: https://example.com/img.jpg' in result

    def test_renders_panel_with_borders(self):
        """Should render panel with border markers."""
        component = {'type': 'EmailPanel', 'props': {'content': 'Panel content'}}
        result = render_component_text(component, {})
        assert '---\nPanel content\n---' in result

    def test_renders_header_with_business_name(self):
        """Should render header with business name."""
        component = {'type': 'EmailHeader', 'props': {'businessName': 'Test Business'}}
        result = render_component_text(component, {})
        assert 'Test Business' in result
        assert '=============' in result  # Underline

    def test_renders_footer_with_all_contact_info(self):
        """Should render footer with all contact info."""
        component = {
            'type': 'EmailFooter',
            'props': {
                'address': '123 Main St',
                'phone': '555-1234',
                'email': 'info@example.com',
                'website': 'https://example.com'
            }
        }
        result = render_component_text(component, {})
        assert '123 Main St' in result
        assert '555-1234' in result
        assert 'info@example.com' in result
        assert 'https://example.com' in result

    def test_renders_branding_in_plaintext(self):
        """Should render branding in plaintext."""
        component = {'type': 'EmailBranding', 'props': {}}
        result = render_component_text(component, {})
        assert 'Powered by SmoothSchedule' in result
        assert 'https://smoothschedule.com' in result

    def test_hides_branding_in_plaintext_when_disabled(self):
        """Should hide branding in plaintext when disabled."""
        component = {'type': 'EmailBranding', 'props': {'showBranding': False}}
        result = render_component_text(component, {})
        assert result == ''

    def test_hides_branding_in_plaintext_with_white_label(self):
        """Should hide branding in plaintext with white-label permission."""
        component = {'type': 'EmailBranding', 'props': {}}
        context = {'can_remove_branding': True}
        result = render_component_text(component, context)
        assert result == ''

    def test_returns_empty_for_unknown_component(self):
        """Should return empty string for unknown component type."""
        component = {'type': 'UnknownComponent', 'props': {}}
        result = render_component_text(component, {})
        assert result == ''


class TestRenderEmailPlaintext:
    """Tests for render_email_plaintext function."""

    def test_renders_plaintext_email(self):
        """Should render plaintext version of email."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Hello world'}}
            ]
        }
        result = render_email_plaintext(puck_data, {})
        assert 'Hello world' in result

    def test_auto_adds_branding_in_plaintext_for_free_plans(self):
        """Should auto-add branding in plaintext when not present."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}}
            ]
        }
        context = {'can_remove_branding': False}
        result = render_email_plaintext(puck_data, context)
        assert 'Powered by SmoothSchedule' in result

    def test_does_not_add_branding_in_plaintext_when_present(self):
        """Should not duplicate branding in plaintext when already present."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}},
                {'type': 'EmailBranding', 'props': {}}
            ]
        }
        result = render_email_plaintext(puck_data, {})
        count = result.count('Powered by SmoothSchedule')
        assert count == 1

    def test_strips_leading_and_trailing_whitespace(self):
        """Should strip leading and trailing whitespace."""
        puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Test'}}
            ]
        }
        result = render_email_plaintext(puck_data, {'can_remove_branding': True})
        # Should be stripped
        assert result.startswith('Test')


class TestRenderEmail:
    """Tests for render_email function."""

    def test_returns_dict_with_subject_html_text(self):
        """Should return dict with subject, html, and text keys."""
        mock_template = Mock()
        mock_template.subject_template = 'Test {{ name }}'
        mock_template.puck_data = {
            'content': [
                {'type': 'EmailText', 'props': {'content': 'Hello {{ name }}'}}
            ]
        }
        context = {'name': 'World', 'can_remove_branding': True}

        result = render_email(mock_template, context)

        assert 'subject' in result
        assert 'html' in result
        assert 'text' in result
        assert result['subject'] == 'Test World'
        assert 'Hello World' in result['html']
        assert 'Hello World' in result['text']


|
||||||
|
"""Tests for render_custom_email function."""
|
||||||
|
|
||||||
|
def test_renders_custom_email_template(self):
|
||||||
|
"""Should render custom email template same as system email."""
|
||||||
|
mock_template = Mock()
|
||||||
|
mock_template.subject_template = 'Custom {{ subject }}'
|
||||||
|
mock_template.puck_data = {
|
||||||
|
'content': [
|
||||||
|
{'type': 'EmailText', 'props': {'content': 'Custom content {{ name }}'}}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
context = {'subject': 'Newsletter', 'name': 'User', 'can_remove_branding': True}
|
||||||
|
|
||||||
|
result = render_custom_email(mock_template, context)
|
||||||
|
|
||||||
|
assert 'subject' in result
|
||||||
|
assert 'html' in result
|
||||||
|
assert 'text' in result
|
||||||
|
assert result['subject'] == 'Custom Newsletter'
|
||||||
|
assert 'Custom content User' in result['html']
|
||||||
|
assert 'Custom content User' in result['text']
|
||||||
@@ -0,0 +1,396 @@
"""
Tests for email_service module.

Tests email sending functionality with mocks to avoid actual email delivery.
"""

import pytest
from unittest.mock import Mock, patch, MagicMock

from smoothschedule.communication.messaging.email_service import (
    is_email_blocked,
    send_system_email,
    send_system_email_bulk,
    get_template_preview,
    send_plain_email,
    send_html_email,
)
from smoothschedule.communication.messaging.email_types import EmailType


class TestIsEmailBlocked:
    """Tests for is_email_blocked function."""

    def test_returns_true_when_tenant_has_block_emails_true(self):
        """Should return True when tenant.block_emails is True."""
        mock_tenant = Mock()
        mock_tenant.block_emails = True

        with patch('django.db.connection') as mock_connection:
            mock_connection.tenant = mock_tenant

            result = is_email_blocked()

            assert result is True

    def test_returns_false_when_tenant_has_block_emails_false(self):
        """Should return False when tenant.block_emails is False."""
        mock_tenant = Mock()
        mock_tenant.block_emails = False

        with patch('django.db.connection') as mock_connection:
            mock_connection.tenant = mock_tenant

            result = is_email_blocked()

            assert result is False

    def test_returns_false_when_no_tenant(self):
        """Should return False when there's no tenant on connection."""
        with patch('django.db.connection') as mock_connection:
            mock_connection.tenant = None

            result = is_email_blocked()

            assert result is False

    def test_returns_false_when_tenant_missing_block_emails_attr(self):
        """Should return False when tenant doesn't have block_emails attribute."""
        mock_tenant = Mock(spec=[])  # No attributes

        with patch('django.db.connection') as mock_connection:
            mock_connection.tenant = mock_tenant

            result = is_email_blocked()

            assert result is False

    def test_returns_false_on_exception(self):
        """Should return False when an exception occurs."""
        # The function imports connection inside a try block and catches exceptions
        result = is_email_blocked()  # Default case returns False when no exception
        assert result is False


class TestSendSystemEmail:
    """Tests for send_system_email function."""

    def test_returns_false_when_no_recipient(self):
        """Should return False and log warning when no recipient provided."""
        result = send_system_email(
            email_type=EmailType.APPOINTMENT_CONFIRMATION,
            to_email='',
            context={'test': 'value'}
        )

        assert result is False

    def test_returns_true_when_email_blocked(self):
        """Should return True without sending when email is blocked."""
        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=True):
            result = send_system_email(
                email_type=EmailType.APPOINTMENT_CONFIRMATION,
                to_email='test@example.com',
                context={'test': 'value'}
            )

            assert result is True

    def test_returns_false_when_template_inactive(self):
        """Should return False when template is inactive."""
        mock_template = Mock()
        mock_template.is_active = False

        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
                MockTemplate.get_or_create_for_type.return_value = mock_template

                result = send_system_email(
                    email_type=EmailType.APPOINTMENT_CONFIRMATION,
                    to_email='test@example.com',
                    context={'test': 'value'}
                )

                assert result is False

    def test_sends_email_successfully(self):
        """Should send email successfully with HTML alternative."""
        mock_template = Mock()
        mock_template.is_active = True

        rendered_email = {
            'subject': 'Test Subject',
            'text': 'Plain text body',
            'html': '<p>HTML body</p>'
        }

        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
                MockTemplate.get_or_create_for_type.return_value = mock_template

                with patch('smoothschedule.communication.messaging.email_service.render_email', return_value=rendered_email):
                    with patch('smoothschedule.communication.messaging.email_service.EmailMultiAlternatives') as MockEmail:
                        mock_msg = Mock()
                        MockEmail.return_value = mock_msg

                        result = send_system_email(
                            email_type=EmailType.APPOINTMENT_CONFIRMATION,
                            to_email='test@example.com',
                            context={'test': 'value'},
                            reply_to='reply@example.com',
                            extra_headers={'X-Custom': 'Header'}
                        )

                        assert result is True
                        mock_msg.send.assert_called_once_with(fail_silently=False)
                        mock_msg.attach_alternative.assert_called_once_with('<p>HTML body</p>', 'text/html')
                        assert mock_msg.reply_to == ['reply@example.com']
                        assert mock_msg.extra_headers == {'X-Custom': 'Header'}

    def test_sends_email_without_html(self):
        """Should send email without HTML when html is None."""
        mock_template = Mock()
        mock_template.is_active = True

        rendered_email = {
            'subject': 'Test Subject',
            'text': 'Plain text body',
            'html': None
        }

        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
                MockTemplate.get_or_create_for_type.return_value = mock_template

                with patch('smoothschedule.communication.messaging.email_service.render_email', return_value=rendered_email):
                    with patch('smoothschedule.communication.messaging.email_service.EmailMultiAlternatives') as MockEmail:
                        mock_msg = Mock()
                        MockEmail.return_value = mock_msg

                        result = send_system_email(
                            email_type=EmailType.APPOINTMENT_CONFIRMATION,
                            to_email='test@example.com'
                        )

                        assert result is True
                        mock_msg.attach_alternative.assert_not_called()

    def test_handles_exception_with_fail_silently(self):
        """Should return False and not raise when fail_silently=True."""
        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
                MockTemplate.get_or_create_for_type.side_effect = Exception("Template error")

                result = send_system_email(
                    email_type=EmailType.APPOINTMENT_CONFIRMATION,
                    to_email='test@example.com',
                    fail_silently=True
                )

                assert result is False

    def test_raises_exception_when_fail_silently_false(self):
        """Should raise exception when fail_silently=False."""
        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
                MockTemplate.get_or_create_for_type.side_effect = Exception("Template error")

                with pytest.raises(Exception) as exc_info:
                    send_system_email(
                        email_type=EmailType.APPOINTMENT_CONFIRMATION,
                        to_email='test@example.com',
                        fail_silently=False
                    )

                assert "Template error" in str(exc_info.value)


class TestSendSystemEmailBulk:
    """Tests for send_system_email_bulk function."""

    def test_sends_to_multiple_recipients(self):
        """Should send emails to multiple recipients."""
        recipients = [
            {'email': 'user1@example.com', 'context': {'name': 'User 1'}},
            {'email': 'user2@example.com', 'context': {'name': 'User 2'}},
        ]

        with patch('smoothschedule.communication.messaging.email_service.send_system_email') as mock_send:
            mock_send.return_value = True

            results = send_system_email_bulk(
                email_type=EmailType.APPOINTMENT_REMINDER,
                recipients=recipients,
                common_context={'business': 'Acme'},
                from_email='sender@example.com'
            )

            assert results == {'user1@example.com': True, 'user2@example.com': True}
            assert mock_send.call_count == 2

    def test_merges_common_and_recipient_context(self):
        """Should merge common context with recipient-specific context."""
        recipients = [
            {'email': 'user@example.com', 'context': {'name': 'User'}},
        ]

        with patch('smoothschedule.communication.messaging.email_service.send_system_email') as mock_send:
            mock_send.return_value = True

            send_system_email_bulk(
                email_type=EmailType.APPOINTMENT_REMINDER,
                recipients=recipients,
                common_context={'business': 'Acme'}
            )

            call_args = mock_send.call_args
            assert call_args.kwargs['context'] == {'business': 'Acme', 'name': 'User'}

    def test_skips_recipients_without_email(self):
        """Should skip recipients without email field."""
        recipients = [
            {'email': 'user@example.com'},
            {'context': {'name': 'No Email'}},  # Missing email - should be skipped
        ]

        with patch('smoothschedule.communication.messaging.email_service.send_system_email') as mock_send:
            mock_send.return_value = True

            results = send_system_email_bulk(
                email_type=EmailType.APPOINTMENT_REMINDER,
                recipients=recipients
            )

            # Only one valid recipient (missing email is skipped)
            assert mock_send.call_count == 1
            assert 'user@example.com' in results

    def test_handles_partial_failures(self):
        """Should continue sending even if some fail."""
        recipients = [
            {'email': 'success@example.com'},
            {'email': 'fail@example.com'},
        ]

        with patch('smoothschedule.communication.messaging.email_service.send_system_email') as mock_send:
            # First call succeeds, second fails
            mock_send.side_effect = [True, False]

            results = send_system_email_bulk(
                email_type=EmailType.APPOINTMENT_REMINDER,
                recipients=recipients,
                fail_silently=True
            )

            assert results == {'success@example.com': True, 'fail@example.com': False}


class TestGetTemplatePreview:
    """Tests for get_template_preview function."""

    def test_returns_rendered_email_preview(self):
        """Should return rendered email without sending."""
        mock_template = Mock()
        rendered = {'subject': 'Test', 'html': '<p>Test</p>', 'text': 'Test'}

        with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
            MockTemplate.get_or_create_for_type.return_value = mock_template

            with patch('smoothschedule.communication.messaging.email_service.render_email', return_value=rendered):
                result = get_template_preview(
                    email_type=EmailType.WELCOME,
                    context={'name': 'User'}
                )

                assert result == rendered

    def test_works_without_context(self):
        """Should work when no context provided."""
        mock_template = Mock()
        rendered = {'subject': 'Test', 'html': '<p>Test</p>', 'text': 'Test'}

        with patch('smoothschedule.communication.messaging.email_service.PuckEmailTemplate') as MockTemplate:
            MockTemplate.get_or_create_for_type.return_value = mock_template

            with patch('smoothschedule.communication.messaging.email_service.render_email', return_value=rendered) as mock_render:
                result = get_template_preview(email_type=EmailType.WELCOME)

                mock_render.assert_called_once_with(mock_template, {})


class TestSendPlainEmail:
    """Tests for send_plain_email function."""

    def test_returns_1_when_email_blocked(self):
        """Should return 1 without sending when email is blocked."""
        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=True):
            result = send_plain_email(
                subject='Test Subject',
                message='Test message',
                from_email='sender@example.com',
                recipient_list=['recipient@example.com']
            )

            assert result == 1

    def test_calls_django_send_mail_when_not_blocked(self):
        """Should call Django's send_mail when not blocked."""
        with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
            with patch('django.core.mail.send_mail', return_value=1) as mock_send:
                result = send_plain_email(
                    subject='Test Subject',
                    message='Test message',
                    from_email='sender@example.com',
                    recipient_list=['recipient@example.com'],
                    fail_silently=True
                )

                assert result == 1
                mock_send.assert_called_once_with(
                    subject='Test Subject',
                    message='Test message',
                    from_email='sender@example.com',
                    recipient_list=['recipient@example.com'],
                    fail_silently=True
                )


class TestSendHtmlEmail:
|
||||||
|
"""Tests for send_html_email function."""
|
||||||
|
|
||||||
|
def test_returns_1_when_email_blocked(self):
|
||||||
|
"""Should return 1 without sending when email is blocked."""
|
||||||
|
with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=True):
|
||||||
|
result = send_html_email(
|
||||||
|
subject='Test Subject',
|
||||||
|
message='Test message',
|
||||||
|
from_email='sender@example.com',
|
||||||
|
recipient_list=['recipient@example.com'],
|
||||||
|
html_message='<p>HTML</p>'
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result == 1
|
||||||
|
|
||||||
|
def test_calls_django_send_mail_with_html(self):
|
||||||
|
"""Should call Django's send_mail with html_message when not blocked."""
|
||||||
|
with patch('smoothschedule.communication.messaging.email_service.is_email_blocked', return_value=False):
|
||||||
|
with patch('django.core.mail.send_mail', return_value=1) as mock_send:
|
||||||
|
result = send_html_email(
|
||||||
|
subject='Test Subject',
|
||||||
|
message='Test message',
|
||||||
|
from_email='sender@example.com',
|
||||||
|
recipient_list=['recipient@example.com'],
|
||||||
|
html_message='<p>HTML</p>',
|
||||||
|
fail_silently=False
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result == 1
|
||||||
|
mock_send.assert_called_once_with(
|
||||||
|
subject='Test Subject',
|
||||||
|
message='Test message',
|
||||||
|
from_email='sender@example.com',
|
||||||
|
recipient_list=['recipient@example.com'],
|
||||||
|
html_message='<p>HTML</p>',
|
||||||
|
fail_silently=False
|
||||||
|
)
|
||||||
@@ -0,0 +1,919 @@
"""
Comprehensive unit tests for mobile/field app Celery tasks.

Tests all tasks with mocks to avoid database overhead and ensure fast execution.
Covers all code paths, error handling, and business logic.
"""
from decimal import Decimal
from unittest.mock import Mock, MagicMock, patch, call
import pytest
from django.utils import timezone
from datetime import datetime, timedelta

from smoothschedule.communication.mobile.tasks import (
    send_customer_status_notification,
    send_sms_notification,
    send_email_notification,
    cleanup_old_location_data,
    cleanup_old_status_history,
)


class TestSendCustomerStatusNotification:
    """Test send_customer_status_notification task."""

    @patch('smoothschedule.identity.core.models.Tenant')
    def test_tenant_not_found_returns_error(self, mock_tenant_model):
        """Test task returns error when tenant doesn't exist."""
        mock_tenant_model.DoesNotExist = Exception
        mock_tenant_model.objects.get.side_effect = mock_tenant_model.DoesNotExist()

        result = send_customer_status_notification(
            tenant_id=999,
            event_id=1,
            notification_type='en_route_notification'
        )

        assert result == {'error': 'Tenant not found'}

    @patch('django_tenants.utils.schema_context')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_event_not_found_returns_error(self, mock_tenant_model, mock_event_model, mock_schema_context):
        """Test task returns error when event doesn't exist."""
        mock_tenant = Mock()
        mock_tenant.schema_name = 'demo'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event_model.DoesNotExist = Exception
        mock_event_model.objects.get.side_effect = mock_event_model.DoesNotExist()

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=999,
            notification_type='en_route_notification'
        )

        assert result == {'error': 'Event not found'}
        mock_schema_context.assert_called_once_with('demo')

    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_no_customer_participant_returns_error(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context
    ):
        """Test task returns error when no customer participant found."""
        mock_tenant = Mock()
        mock_tenant.schema_name = 'demo'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 1
        mock_event_model.objects.get.return_value = mock_event

        mock_user_ct = Mock()
        mock_ct_model.objects.get_for_model.return_value = mock_user_ct

        # No customer participant
        mock_participant_model.objects.filter.return_value.first.return_value = None
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=1,
            notification_type='en_route_notification'
        )

        assert result == {'error': 'No customer found'}

    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_customer_object_not_found_returns_error(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context
    ):
        """Test task returns error when customer object is None."""
        mock_tenant = Mock()
        mock_tenant.schema_name = 'demo'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 1
        mock_event_model.objects.get.return_value = mock_event

        mock_user_ct = Mock()
        mock_ct_model.objects.get_for_model.return_value = mock_user_ct

        # Customer participant exists but content_object is None
        mock_participant = Mock()
        mock_participant.content_object = None
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=1,
            notification_type='en_route_notification'
        )

        assert result == {'error': 'Customer object not found'}

    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_unknown_notification_type_returns_error(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context
    ):
        """Test task returns error for unknown notification type."""
        mock_tenant = Mock()
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 1
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = '+15551234567'
        mock_customer.email = 'customer@example.com'
        mock_customer.full_name = 'John Doe'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=1,
            notification_type='invalid_notification_type'
        )

        assert result == {'error': 'Unknown notification type: invalid_notification_type'}

    @patch('smoothschedule.communication.mobile.tasks.send_sms_notification')
    @patch('smoothschedule.communication.mobile.tasks.send_email_notification')
    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_en_route_notification_sends_sms_and_email(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context, mock_send_email, mock_send_sms
    ):
        """Test en_route notification sends both SMS and email."""
        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant.has_feature.return_value = True
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 123
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = '+15551234567'
        mock_customer.email = 'customer@example.com'
        mock_customer.full_name = 'John Doe'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=123,
            notification_type='en_route_notification'
        )

        # Verify SMS task queued
        mock_send_sms.delay.assert_called_once_with(
            tenant_id=1,
            phone_number='+15551234567',
            message='Your technician from Test Business is on the way! They should arrive soon.',
        )

        # Verify email task queued
        mock_send_email.delay.assert_called_once_with(
            tenant_id=1,
            email='customer@example.com',
            subject='Technician En Route - Test Business',
            message='Your technician from Test Business is on the way! They should arrive soon.',
            customer_name='John Doe',
        )

        assert result == {'success': True, 'notification_type': 'en_route_notification'}

    @patch('smoothschedule.communication.mobile.tasks.send_email_notification')
    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_arrived_notification_sends_email_only_when_no_phone(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context, mock_send_email
    ):
        """Test arrived notification sends only email when customer has no phone."""
        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 123
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = None  # No phone
        mock_customer.email = 'customer@example.com'
        mock_customer.full_name = 'Jane Smith'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=123,
            notification_type='arrived_notification'
        )

        # Verify email task queued
        mock_send_email.delay.assert_called_once_with(
            tenant_id=1,
            email='customer@example.com',
            subject='Technician Arrived - Test Business',
            message='Your technician from Test Business has arrived and is starting work.',
            customer_name='Jane Smith',
        )

        assert result == {'success': True, 'notification_type': 'arrived_notification'}

    @patch('smoothschedule.communication.mobile.tasks.send_sms_notification')
    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_completed_notification_skips_sms_when_no_feature(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context, mock_send_sms
    ):
        """Test completed notification skips SMS when tenant lacks SMS feature."""
        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant.has_feature.return_value = False  # No SMS feature
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 123
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = '+15551234567'
        mock_customer.email = None  # No email
        mock_customer.full_name = 'Bob Jones'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=123,
            notification_type='completed_notification'
        )

        # SMS should not be queued
        mock_send_sms.delay.assert_not_called()

        assert result == {'success': True, 'notification_type': 'completed_notification'}

    @patch('smoothschedule.communication.mobile.tasks.send_sms_notification')
    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_sms_error_logged_but_task_continues(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context, mock_send_sms
    ):
        """Test task continues even if SMS queuing fails."""
        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant.has_feature.return_value = True
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 123
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = '+15551234567'
        mock_customer.email = None
        mock_customer.full_name = 'Alice Brown'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        # SMS queuing raises exception
        mock_send_sms.delay.side_effect = Exception('Celery connection error')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=123,
            notification_type='en_route_notification'
        )

        # Task should still succeed
        assert result == {'success': True, 'notification_type': 'en_route_notification'}

    @patch('smoothschedule.communication.mobile.tasks.send_email_notification')
    @patch('django_tenants.utils.schema_context')
    @patch('django.contrib.contenttypes.models.ContentType')
    @patch('smoothschedule.scheduling.schedule.models.Participant')
    @patch('smoothschedule.scheduling.schedule.models.Event')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_email_error_logged_but_task_continues(
        self, mock_tenant_model, mock_event_model, mock_participant_model,
        mock_ct_model, mock_schema_context, mock_send_email
    ):
        """Test task continues even if email queuing fails."""
        mock_tenant = Mock()
        mock_tenant.id = 1
        mock_tenant.schema_name = 'demo'
        mock_tenant.name = 'Test Business'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_event = Mock()
        mock_event.id = 123
        mock_event_model.objects.get.return_value = mock_event

        mock_customer = Mock()
        mock_customer.phone = None
        mock_customer.email = 'customer@example.com'
        mock_customer.full_name = 'Charlie Davis'

        mock_participant = Mock()
        mock_participant.content_object = mock_customer
        mock_participant_model.objects.filter.return_value.first.return_value = mock_participant
        mock_participant_model.Role = Mock(CUSTOMER='customer')

        # Email queuing raises exception
        mock_send_email.delay.side_effect = Exception('SMTP error')

        result = send_customer_status_notification(
            tenant_id=1,
            event_id=123,
            notification_type='arrived_notification'
        )

        # Task should still succeed
        assert result == {'success': True, 'notification_type': 'arrived_notification'}


class TestSendSmsNotification:
    """Test send_sms_notification task."""

    @patch('smoothschedule.identity.core.models.Tenant')
    def test_tenant_not_found_returns_error(self, mock_tenant_model):
        """Test task returns error when tenant doesn't exist."""
        mock_tenant_model.DoesNotExist = Exception
        mock_tenant_model.objects.get.side_effect = mock_tenant_model.DoesNotExist()

        result = send_sms_notification(
            tenant_id=999,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'Tenant not found'}

    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_insufficient_credits_returns_error(self, mock_tenant_model, mock_credits_model):
        """Test task returns error when credits are insufficient."""
        mock_tenant = Mock()
        mock_tenant.name = 'Test Business'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 3  # Less than 5
        mock_credits_model.objects.get.return_value = mock_credits

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'Insufficient credits'}

    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_credits_not_configured_returns_error(self, mock_tenant_model, mock_credits_model):
        """Test task returns error when credits don't exist."""
        mock_tenant = Mock()
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits_model.DoesNotExist = Exception
        mock_credits_model.objects.get.side_effect = mock_credits_model.DoesNotExist()

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'Credits not configured'}

    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_twilio_not_configured_returns_error(self, mock_tenant_model, mock_credits_model):
        """Test task returns error when Twilio not configured."""
        mock_tenant = Mock()
        mock_tenant.twilio_subaccount_sid = None
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 100
        mock_credits_model.objects.get.return_value = mock_credits

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'Twilio not configured'}

    @patch('twilio.rest.Client')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_no_from_number_returns_error(
        self, mock_tenant_model, mock_credits_model, mock_settings, mock_twilio_client
    ):
        """Test task returns error when no from number configured."""
        mock_tenant = Mock()
        mock_tenant.twilio_subaccount_sid = 'AC123'
        mock_tenant.twilio_subaccount_auth_token = 'token123'
        mock_tenant.twilio_phone_number = None
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 100
        mock_credits_model.objects.get.return_value = mock_credits

        # No default number in settings
        mock_settings.TWILIO_DEFAULT_FROM_NUMBER = ''

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'No from number configured'}

    @patch('twilio.rest.Client')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_successful_sms_with_tenant_phone(
        self, mock_tenant_model, mock_credits_model, mock_settings, mock_twilio_client
    ):
        """Test successful SMS sending with tenant's phone number."""
        mock_tenant = Mock()
        mock_tenant.twilio_subaccount_sid = 'AC123'
        mock_tenant.twilio_subaccount_auth_token = 'token123'
        mock_tenant.twilio_phone_number = '+15559999999'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 100
        mock_credits_model.objects.get.return_value = mock_credits

        mock_sms = Mock()
        mock_sms.sid = 'SM123456789'
        mock_client = Mock()
        mock_client.messages.create.return_value = mock_sms
        mock_twilio_client.return_value = mock_client

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Your technician is on the way!'
        )

        # Verify Twilio client created with tenant credentials
        mock_twilio_client.assert_called_once_with('AC123', 'token123')

        # Verify SMS sent
        mock_client.messages.create.assert_called_once_with(
            to='+15551234567',
            from_='+15559999999',
            body='Your technician is on the way!',
        )

        # Verify credits deducted
        mock_credits.deduct.assert_called_once_with(
            5,
            'Status notification SMS to 4567',
            reference_type='notification_sms',
            reference_id='SM123456789',
        )

        assert result == {'success': True, 'message_sid': 'SM123456789'}

    @patch('twilio.rest.Client')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_successful_sms_with_default_phone(
        self, mock_tenant_model, mock_credits_model, mock_settings, mock_twilio_client
    ):
        """Test successful SMS sending with default phone number."""
        mock_tenant = Mock()
        mock_tenant.twilio_subaccount_sid = 'AC123'
        mock_tenant.twilio_subaccount_auth_token = 'token123'
        mock_tenant.twilio_phone_number = None  # No tenant phone
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 100
        mock_credits_model.objects.get.return_value = mock_credits

        mock_settings.TWILIO_DEFAULT_FROM_NUMBER = '+15558888888'

        mock_sms = Mock()
        mock_sms.sid = 'SM987654321'
        mock_client = Mock()
        mock_client.messages.create.return_value = mock_sms
        mock_twilio_client.return_value = mock_client

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Appointment completed'
        )

        # Verify SMS sent with default number
        mock_client.messages.create.assert_called_once_with(
            to='+15551234567',
            from_='+15558888888',
            body='Appointment completed',
        )

        assert result == {'success': True, 'message_sid': 'SM987654321'}

    @patch('twilio.rest.Client')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.communication.credits.models.CommunicationCredits')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_twilio_error_returns_error(
        self, mock_tenant_model, mock_credits_model, mock_settings, mock_twilio_client
    ):
        """Test task returns error when Twilio raises exception."""
        mock_tenant = Mock()
        mock_tenant.twilio_subaccount_sid = 'AC123'
        mock_tenant.twilio_subaccount_auth_token = 'token123'
        mock_tenant.twilio_phone_number = '+15559999999'
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_credits = Mock()
        mock_credits.balance_cents = 100
        mock_credits_model.objects.get.return_value = mock_credits

        # Twilio client raises exception
        mock_client = Mock()
        mock_client.messages.create.side_effect = Exception('Invalid phone number')
        mock_twilio_client.return_value = mock_client

        result = send_sms_notification(
            tenant_id=1,
            phone_number='+15551234567',
            message='Test message'
        )

        assert result == {'error': 'Invalid phone number'}
        # Credits should not be deducted
        mock_credits.deduct.assert_not_called()

class TestSendEmailNotification:
    """Test send_email_notification task."""

    @patch('smoothschedule.identity.core.models.Tenant')
    def test_tenant_not_found_returns_error(self, mock_tenant_model):
        """Test task returns error when tenant doesn't exist."""
        mock_tenant_model.DoesNotExist = Exception
        mock_tenant_model.objects.get.side_effect = mock_tenant_model.DoesNotExist()

        result = send_email_notification(
            tenant_id=999,
            email='customer@example.com',
            subject='Test',
            message='Test message'
        )

        assert result == {'error': 'Tenant not found'}

    @patch('django.core.mail.send_mail')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_successful_email_with_tenant_email(
        self, mock_tenant_model, mock_settings, mock_send_mail
    ):
        """Test successful email sending with tenant's contact email."""
        mock_tenant = Mock()
        mock_tenant.name = 'Test Business'
        mock_tenant.contact_email = 'business@example.com'
        mock_tenant_model.objects.get.return_value = mock_tenant

        result = send_email_notification(
            tenant_id=1,
            email='customer@example.com',
            subject='Technician En Route',
            message='Your technician is on the way!',
            customer_name='John Doe'
        )

        # Verify email sent
        mock_send_mail.assert_called_once()
        call_args = mock_send_mail.call_args
        assert call_args[0][0] == 'Technician En Route'
        assert 'Hi John Doe' in call_args[0][1]
        assert 'Your technician is on the way!' in call_args[0][1]
        assert 'Test Business' in call_args[0][1]
        assert call_args[0][2] == 'business@example.com'
        assert call_args[0][3] == ['customer@example.com']
        assert call_args[1]['fail_silently'] is False

        assert result == {'success': True}

    @patch('django.core.mail.send_mail')
    @patch('smoothschedule.communication.mobile.tasks.settings')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_successful_email_with_default_email(
        self, mock_tenant_model, mock_settings, mock_send_mail
    ):
        """Test successful email sending with default from email."""
        mock_tenant = Mock()
        mock_tenant.name = 'Another Business'
        mock_tenant.contact_email = None
        mock_tenant_model.objects.get.return_value = mock_tenant

        mock_settings.DEFAULT_FROM_EMAIL = 'noreply@smoothschedule.com'

        result = send_email_notification(
            tenant_id=1,
            email='jane@example.com',
            subject='Appointment Completed',
            message='Thank you for your business!',
            customer_name='Jane Smith'
        )

        # Verify email sent with default from
        mock_send_mail.assert_called_once()
        call_args = mock_send_mail.call_args
        assert call_args[0][2] == 'noreply@smoothschedule.com'
|
||||||
|
assert 'Hi Jane Smith' in call_args[0][1]
|
||||||
|
assert 'Another Business' in call_args[0][1]
|
||||||
|
|
||||||
|
assert result == {'success': True}
|
||||||
|
|
||||||
|
@patch('django.core.mail.send_mail')
|
||||||
|
@patch('smoothschedule.communication.mobile.tasks.settings')
|
||||||
|
@patch('smoothschedule.identity.core.models.Tenant')
|
||||||
|
def test_default_customer_name(
|
||||||
|
self, mock_tenant_model, mock_settings, mock_send_mail
|
||||||
|
):
|
||||||
|
"""Test email uses default customer name when not provided."""
|
||||||
|
mock_tenant = Mock()
|
||||||
|
mock_tenant.name = 'Test Business'
|
||||||
|
mock_tenant.contact_email = 'business@example.com'
|
||||||
|
mock_tenant_model.objects.get.return_value = mock_tenant
|
||||||
|
|
||||||
|
result = send_email_notification(
|
||||||
|
tenant_id=1,
|
||||||
|
email='customer@example.com',
|
||||||
|
subject='Notification',
|
||||||
|
message='Test message'
|
||||||
|
# customer_name not provided, defaults to 'Customer'
|
||||||
|
)
|
||||||
|
|
||||||
|
# Verify email uses default name
|
||||||
|
mock_send_mail.assert_called_once()
|
||||||
|
call_args = mock_send_mail.call_args
|
||||||
|
assert 'Hi Customer' in call_args[0][1]
|
||||||
|
|
||||||
|
assert result == {'success': True}
|
||||||
|
|
||||||
|
@patch('django.core.mail.send_mail')
|
||||||
|
@patch('smoothschedule.communication.mobile.tasks.settings')
|
||||||
|
@patch('smoothschedule.identity.core.models.Tenant')
|
||||||
|
def test_email_error_returns_error(
|
||||||
|
self, mock_tenant_model, mock_settings, mock_send_mail
|
||||||
|
):
|
||||||
|
"""Test task returns error when email sending fails."""
|
||||||
|
mock_tenant = Mock()
|
||||||
|
mock_tenant.name = 'Test Business'
|
||||||
|
mock_tenant.contact_email = 'business@example.com'
|
||||||
|
mock_tenant_model.objects.get.return_value = mock_tenant
|
||||||
|
|
||||||
|
# send_mail raises exception
|
||||||
|
mock_send_mail.side_effect = Exception('SMTP connection failed')
|
||||||
|
|
||||||
|
result = send_email_notification(
|
||||||
|
tenant_id=1,
|
||||||
|
email='customer@example.com',
|
||||||
|
subject='Test',
|
||||||
|
message='Test message'
|
||||||
|
)
|
||||||
|
|
||||||
|
assert result == {'error': 'SMTP connection failed'}
|
||||||
|
|
||||||
|
|
||||||
|
class TestCleanupOldLocationData:
|
||||||
|
"""Test cleanup_old_location_data task."""
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EmployeeLocationUpdate')
|
||||||
|
def test_deletes_old_location_updates(self, mock_location_model, mock_tz):
|
||||||
|
"""Test task deletes location updates older than specified days."""
|
||||||
|
mock_now = datetime(2024, 6, 15, 12, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
# Mock queryset
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (42, {})
|
||||||
|
mock_location_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_location_data(days_to_keep=30)
|
||||||
|
|
||||||
|
# Verify cutoff date calculation
|
||||||
|
expected_cutoff = mock_now - timedelta(days=30)
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
call_args = mock_qs.filter.call_args
|
||||||
|
assert 'created_at__lt' in call_args[1]
|
||||||
|
|
||||||
|
assert result == {'deleted': 42}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EmployeeLocationUpdate')
|
||||||
|
def test_uses_default_30_days(self, mock_location_model, mock_tz):
|
||||||
|
"""Test task uses default of 30 days when not specified."""
|
||||||
|
mock_now = datetime(2024, 6, 15, 12, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (0, {})
|
||||||
|
mock_location_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_location_data() # No argument
|
||||||
|
|
||||||
|
# Should use 30 days
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
assert result == {'deleted': 0}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EmployeeLocationUpdate')
|
||||||
|
def test_custom_retention_period(self, mock_location_model, mock_tz):
|
||||||
|
"""Test task respects custom retention period."""
|
||||||
|
mock_now = datetime(2024, 12, 1, 0, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (150, {})
|
||||||
|
mock_location_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_location_data(days_to_keep=7)
|
||||||
|
|
||||||
|
# Should delete data older than 7 days
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
assert result == {'deleted': 150}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EmployeeLocationUpdate')
|
||||||
|
def test_no_old_data_returns_zero(self, mock_location_model, mock_tz):
|
||||||
|
"""Test task returns zero when no old data exists."""
|
||||||
|
mock_now = datetime(2024, 1, 1, 0, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (0, {})
|
||||||
|
mock_location_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_location_data(days_to_keep=90)
|
||||||
|
|
||||||
|
assert result == {'deleted': 0}
|
||||||
|
|
||||||
|
|
||||||
|
class TestCleanupOldStatusHistory:
|
||||||
|
"""Test cleanup_old_status_history task."""
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EventStatusHistory')
|
||||||
|
def test_deletes_old_status_history(self, mock_history_model, mock_tz):
|
||||||
|
"""Test task deletes status history older than specified days."""
|
||||||
|
mock_now = datetime(2024, 12, 1, 12, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
# Mock queryset
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (500, {})
|
||||||
|
mock_history_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_status_history(days_to_keep=365)
|
||||||
|
|
||||||
|
# Verify cutoff date calculation
|
||||||
|
expected_cutoff = mock_now - timedelta(days=365)
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
call_args = mock_qs.filter.call_args
|
||||||
|
assert 'changed_at__lt' in call_args[1]
|
||||||
|
|
||||||
|
assert result == {'deleted': 500}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EventStatusHistory')
|
||||||
|
def test_uses_default_365_days(self, mock_history_model, mock_tz):
|
||||||
|
"""Test task uses default of 365 days when not specified."""
|
||||||
|
mock_now = datetime(2024, 6, 15, 0, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (25, {})
|
||||||
|
mock_history_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_status_history() # No argument
|
||||||
|
|
||||||
|
# Should use 365 days
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
assert result == {'deleted': 25}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EventStatusHistory')
|
||||||
|
def test_custom_retention_period(self, mock_history_model, mock_tz):
|
||||||
|
"""Test task respects custom retention period."""
|
||||||
|
mock_now = datetime(2025, 1, 1, 0, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (1000, {})
|
||||||
|
mock_history_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_status_history(days_to_keep=180)
|
||||||
|
|
||||||
|
# Should delete data older than 180 days
|
||||||
|
mock_qs.filter.assert_called_once()
|
||||||
|
assert result == {'deleted': 1000}
|
||||||
|
|
||||||
|
@patch('django.utils.timezone')
|
||||||
|
@patch('smoothschedule.communication.mobile.models.EventStatusHistory')
|
||||||
|
def test_no_old_history_returns_zero(self, mock_history_model, mock_tz):
|
||||||
|
"""Test task returns zero when no old history exists."""
|
||||||
|
mock_now = datetime(2024, 1, 1, 0, 0, 0)
|
||||||
|
mock_tz.now.return_value = mock_now
|
||||||
|
mock_tz.timedelta = timedelta
|
||||||
|
|
||||||
|
mock_qs = Mock()
|
||||||
|
mock_qs.filter.return_value.delete.return_value = (0, {})
|
||||||
|
mock_history_model.objects = mock_qs
|
||||||
|
|
||||||
|
result = cleanup_old_status_history(days_to_keep=730)
|
||||||
|
|
||||||
|
assert result == {'deleted': 0}
|
||||||
@@ -180,3 +180,254 @@ def seed_email_templates_on_tenant_create(sender, instance, created, **kwargs):
    schema_name = instance.schema_name
    transaction.on_commit(lambda: _seed_email_templates_for_tenant(schema_name))


def _provision_activepieces_connection(tenant_id):
    """
    Provision SmoothSchedule connection in Activepieces for a tenant.
    Called after transaction commits to ensure tenant is fully saved.
    """
    from smoothschedule.identity.core.models import Tenant
    from django.conf import settings

    # Only provision if Activepieces is configured
    if not getattr(settings, 'ACTIVEPIECES_JWT_SECRET', ''):
        logger.debug("Activepieces not configured, skipping connection provisioning")
        return

    try:
        tenant = Tenant.objects.get(id=tenant_id)

        # Check if tenant has the automation feature (optional check)
        if hasattr(tenant, 'has_feature') and not tenant.has_feature('can_use_automations'):
            logger.debug(
                f"Tenant {tenant.schema_name} doesn't have automation feature, "
                "skipping Activepieces connection"
            )
            return

        # Import here to avoid circular imports
        from smoothschedule.integrations.activepieces.services import provision_tenant_connection

        success = provision_tenant_connection(tenant)
        if success:
            logger.info(
                f"Provisioned Activepieces connection for tenant: {tenant.schema_name}"
            )
        else:
            logger.warning(
                f"Failed to provision Activepieces connection for tenant: {tenant.schema_name}"
            )

    except Tenant.DoesNotExist:
        logger.error(f"Tenant {tenant_id} not found when provisioning Activepieces connection")
    except Exception as e:
        logger.error(f"Failed to provision Activepieces connection for tenant {tenant_id}: {e}")


@receiver(post_save, sender='core.Tenant')
def provision_activepieces_on_tenant_create(sender, instance, created, **kwargs):
    """
    Provision SmoothSchedule connection in Activepieces when a new tenant is created.

    This ensures every new tenant has a pre-configured, protected connection
    to SmoothSchedule in Activepieces so they can immediately use automation
    features without manual setup.

    Uses transaction.on_commit() to defer provisioning until after the tenant
    is fully saved.
    """
    if not created:
        return

    # Skip public schema
    if instance.schema_name == 'public':
        return

    tenant_id = instance.id
    # Use a delay to ensure all other tenant setup is complete
    transaction.on_commit(lambda: _provision_activepieces_connection(tenant_id))


def _provision_default_flows_for_tenant(tenant_id):
    """
    Provision default automation flows in Activepieces for a tenant.
    Called after Activepieces connection is provisioned.
    """
    from smoothschedule.identity.core.models import Tenant
    from django.conf import settings

    # Only provision if Activepieces is configured
    if not getattr(settings, 'ACTIVEPIECES_JWT_SECRET', ''):
        logger.debug("Activepieces not configured, skipping default flows")
        return

    try:
        tenant = Tenant.objects.get(id=tenant_id)

        # Check if tenant has the automation feature
        if hasattr(tenant, 'has_feature') and not tenant.has_feature('can_use_automations'):
            logger.debug(
                f"Tenant {tenant.schema_name} doesn't have automation feature, "
                "skipping default flows"
            )
            return

        # Import here to avoid circular imports
        from smoothschedule.integrations.activepieces.services import (
            get_activepieces_client,
        )
        from smoothschedule.integrations.activepieces.models import (
            TenantActivepiecesProject,
            TenantDefaultFlow,
        )
        from smoothschedule.integrations.activepieces.default_flows import (
            get_all_flow_definitions,
            get_sample_data_for_flow,
            FLOW_VERSION,
        )
        from django_tenants.utils import schema_context

        # Get or create Activepieces project for this tenant
        try:
            project = TenantActivepiecesProject.objects.get(tenant=tenant)
        except TenantActivepiecesProject.DoesNotExist:
            logger.warning(
                f"No Activepieces project for tenant {tenant.schema_name}, "
                "skipping default flows"
            )
            return

        client = get_activepieces_client()

        # Get a session token for API calls
        provisioning_token = client._generate_trust_token(tenant)
        result = client._request(
            "POST",
            "/api/v1/authentication/django-trust",
            data={"token": provisioning_token},
        )
        session_token = result.get("token")
        project_id = result.get("projectId")

        if not session_token:
            logger.error(
                f"Failed to get Activepieces session for tenant {tenant.schema_name}"
            )
            return

        # Create each default flow
        flow_definitions = get_all_flow_definitions()
        created_count = 0

        with schema_context(tenant.schema_name):
            for flow_type, flow_def in flow_definitions.items():
                # Check if flow already exists
                if TenantDefaultFlow.objects.filter(
                    tenant=tenant, flow_type=flow_type
                ).exists():
                    logger.debug(
                        f"Default flow {flow_type} already exists for tenant {tenant.schema_name}"
                    )
                    continue

                try:
                    # Create the flow in Activepieces
                    # Put default flows in a "Defaults" folder for organization
                    created_flow = client.create_flow(
                        project_id=project_id,
                        token=session_token,
                        flow_data={
                            "displayName": flow_def.get("displayName", flow_type),
                            "trigger": flow_def.get("trigger"),
                        },
                        folder_name="Defaults",
                    )

                    flow_id = created_flow.get("id")
                    if not flow_id:
                        logger.error(
                            f"Failed to create flow {flow_type} for tenant {tenant.schema_name}"
                        )
                        continue

                    # Save sample data for the trigger
                    sample_data = get_sample_data_for_flow(flow_type)
                    if sample_data:
                        try:
                            client.save_sample_data(
                                flow_id=flow_id,
                                token=session_token,
                                step_name="trigger",
                                sample_data=sample_data,
                            )
                        except Exception as e:
                            logger.warning(
                                f"Failed to save sample data for flow {flow_type}: {e}"
                            )

                    # Publish the flow (locks version and enables)
                    try:
                        client.publish_flow(flow_id, session_token)
                    except Exception as e:
                        logger.warning(
                            f"Failed to publish flow {flow_type}, enabling instead: {e}"
                        )
                        # Fallback to just enabling
                        client.update_flow_status(flow_id, session_token, enabled=True)

                    # Store the flow record in Django
                    TenantDefaultFlow.objects.create(
                        tenant=tenant,
                        flow_type=flow_type,
                        activepieces_flow_id=flow_id,
                        default_flow_json=flow_def,
                        version=FLOW_VERSION,
                        is_enabled=True,
                        is_modified=False,
                    )

                    created_count += 1
                    logger.info(
                        f"Created default flow {flow_type} for tenant {tenant.schema_name}"
                    )

                except Exception as e:
                    logger.error(
                        f"Failed to create flow {flow_type} for tenant {tenant.schema_name}: {e}"
                    )

        logger.info(
            f"Provisioned {created_count} default flows for tenant: {tenant.schema_name}"
        )

    except Tenant.DoesNotExist:
        logger.error(f"Tenant {tenant_id} not found when provisioning default flows")
    except Exception as e:
        logger.error(f"Failed to provision default flows for tenant {tenant_id}: {e}")


@receiver(post_save, sender='core.Tenant')
def provision_default_flows_on_tenant_create(sender, instance, created, **kwargs):
    """
    Provision default automation flows when a new tenant is created.

    This creates the standard email automation flows (confirmation, reminder,
    thank you, payment confirmations) so tenants have working automations
    out of the box.

    Runs after Activepieces connection provisioning via on_commit with a delayed
    lambda to ensure the connection exists first.
    """
    if not created:
        return

    # Skip public schema
    if instance.schema_name == 'public':
        return

    tenant_id = instance.id
    # Use a second on_commit to run after the connection provisioning
    # The ordering of on_commit callbacks is preserved, so this will run
    # after _provision_activepieces_connection
    transaction.on_commit(lambda: _provision_default_flows_for_tenant(tenant_id))
439	smoothschedule/smoothschedule/identity/core/tests/test_admin.py	Normal file
@@ -0,0 +1,439 @@
"""
Unit tests for identity/core/admin.py

Tests Django admin classes using mocks to avoid database hits.
Following the testing pyramid: prefer fast unit tests over slow integration tests.
"""
from unittest.mock import Mock, patch, MagicMock
from datetime import datetime, timedelta
import pytest
from django.utils import timezone


class TestTenantAdmin:
    """Tests for TenantAdmin class."""

    def test_user_count_displays_count(self):
        """Should display user count."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_tenant = Mock()
        mock_tenant.users.count.return_value = 5
        mock_tenant.get_limit.return_value = 10

        result = admin.user_count(mock_tenant)

        # Should format with green color when under limit
        assert '5' in str(result)
        assert 'green' in str(result)

    def test_user_count_shows_red_when_at_limit(self):
        """Should show red color when at or over limit."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_tenant = Mock()
        mock_tenant.users.count.return_value = 10
        mock_tenant.get_limit.return_value = 10

        result = admin.user_count(mock_tenant)

        # Should format with red color when at limit
        assert '10' in str(result)
        assert 'red' in str(result)

    def test_user_count_shows_green_for_unlimited(self):
        """Should show green color when unlimited (None or 0)."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_tenant = Mock()
        mock_tenant.users.count.return_value = 100
        mock_tenant.get_limit.return_value = None  # Unlimited

        result = admin.user_count(mock_tenant)

        # Should format with green color for unlimited
        assert '100' in str(result)
        assert 'green' in str(result)

    def test_user_count_shows_green_for_zero_limit(self):
        """Should show green color when limit is 0 (unlimited)."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_tenant = Mock()
        mock_tenant.users.count.return_value = 50
        mock_tenant.get_limit.return_value = 0  # 0 means unlimited

        result = admin.user_count(mock_tenant)

        # Should format with green color for unlimited
        assert '50' in str(result)
        assert 'green' in str(result)

    @patch('smoothschedule.identity.core.admin.reverse')
    def test_domain_list_creates_links(self, mock_reverse):
        """Should create clickable links for each domain."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_domain1 = Mock(domain='test1.example.com', pk=1)
        mock_domain2 = Mock(domain='test2.example.com', pk=2)

        mock_tenant = Mock()
        mock_tenant.domain_set.all.return_value = [mock_domain1, mock_domain2]

        mock_reverse.side_effect = lambda *args, **kwargs: f"/admin/core/domain/{kwargs.get('args', [0])[0]}/"

        result = admin.domain_list(mock_tenant)

        # Should contain both domain names
        assert 'test1.example.com' in str(result)
        assert 'test2.example.com' in str(result)
        # Should contain links
        assert '<a href=' in str(result)

    def test_domain_list_shows_dash_for_no_domains(self):
        """Should show dash when no domains exist."""
        from smoothschedule.identity.core.admin import TenantAdmin

        admin = TenantAdmin(Mock(), Mock())

        mock_tenant = Mock()
        mock_tenant.domain_set.all.return_value = []

        result = admin.domain_list(mock_tenant)

        assert result == '-'


class TestDomainAdmin:
    """Tests for DomainAdmin class."""

    def test_verified_status_shows_verified(self):
        """Should show verified status with green checkmark."""
        from smoothschedule.identity.core.admin import DomainAdmin

        admin = DomainAdmin(Mock(), Mock())

        mock_domain = Mock()
        mock_domain.is_verified.return_value = True

        result = admin.verified_status(mock_domain)

        # Should show verified with green color
        assert 'Verified' in str(result)
        assert 'green' in str(result)
        assert '✓' in str(result)

    def test_verified_status_shows_pending(self):
        """Should show pending status with orange warning."""
        from smoothschedule.identity.core.admin import DomainAdmin

        admin = DomainAdmin(Mock(), Mock())

        mock_domain = Mock()
        mock_domain.is_verified.return_value = False

        result = admin.verified_status(mock_domain)

        # Should show pending with orange color
        assert 'Pending' in str(result)
        assert 'orange' in str(result)
        assert '⚠' in str(result)


class TestPermissionGrantAdmin:
    """Tests for PermissionGrantAdmin class."""

    def test_status_shows_revoked(self):
        """Should show revoked status with red cross."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        now = timezone.now()
        mock_grant = Mock()
        mock_grant.revoked_at = now
        mock_grant.is_active.return_value = False

        result = admin.status(mock_grant)

        # Should show revoked with red color
        assert 'Revoked' in str(result)
        assert 'red' in str(result)
        assert '✗' in str(result)

    def test_status_shows_active(self):
        """Should show active status with green checkmark."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.revoked_at = None
        mock_grant.is_active.return_value = True

        result = admin.status(mock_grant)

        # Should show active with green color
        assert 'Active' in str(result)
        assert 'green' in str(result)
        assert '✓' in str(result)

    def test_status_shows_expired(self):
        """Should show expired status with gray symbol."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.revoked_at = None
        mock_grant.is_active.return_value = False

        result = admin.status(mock_grant)

        # Should show expired with gray color
        assert 'Expired' in str(result)
        assert 'gray' in str(result)
        assert '⊘' in str(result)

    def test_time_left_shows_dash_when_none(self):
        """Should show dash when no time remaining."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.time_remaining.return_value = None

        result = admin.time_left(mock_grant)

        assert result == '-'

    def test_time_left_shows_red_for_less_than_5_minutes(self):
        """Should show red color for less than 5 minutes remaining."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.time_remaining.return_value = timedelta(minutes=3)

        result = admin.time_left(mock_grant)

        # Should show 3 minutes in red
        assert '3 min' in str(result)
        assert 'red' in str(result)

    def test_time_left_shows_orange_for_5_to_15_minutes(self):
        """Should show orange color for 5-15 minutes remaining."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.time_remaining.return_value = timedelta(minutes=10)

        result = admin.time_left(mock_grant)

        # Should show 10 minutes in orange
        assert '10 min' in str(result)
        assert 'orange' in str(result)

    def test_time_left_shows_green_for_more_than_15_minutes(self):
        """Should show green color for more than 15 minutes remaining."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        mock_grant = Mock()
        mock_grant.time_remaining.return_value = timedelta(minutes=30)

        result = admin.time_left(mock_grant)

        # Should show 30 minutes in green
        assert '30 min' in str(result)
        assert 'green' in str(result)

    def test_revoke_grants_action_revokes_active_grants(self):
        """Should revoke active grants via admin action."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        # Create mock grants
        mock_grant1 = Mock()
        mock_grant1.is_active.return_value = True

        mock_grant2 = Mock()
        mock_grant2.is_active.return_value = False  # Already inactive

        mock_grant3 = Mock()
        mock_grant3.is_active.return_value = True

        mock_queryset = [mock_grant1, mock_grant2, mock_grant3]
        mock_request = Mock()

        # Mock message_user
        admin.message_user = Mock()

        admin.revoke_grants(mock_request, mock_queryset)

        # Should revoke only active grants
        mock_grant1.revoke.assert_called_once()
        mock_grant2.revoke.assert_not_called()
        mock_grant3.revoke.assert_called_once()

        # Should show success message
        admin.message_user.assert_called_once()
        call_args = admin.message_user.call_args[0]
        assert '2 permission grant(s)' in call_args[1]

    def test_revoke_grants_action_handles_no_active_grants(self):
        """Should handle case where no grants are active."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        admin = PermissionGrantAdmin(Mock(), Mock())

        # Create mock grants - all inactive
        mock_grant1 = Mock()
        mock_grant1.is_active.return_value = False

        mock_grant2 = Mock()
        mock_grant2.is_active.return_value = False

        mock_queryset = [mock_grant1, mock_grant2]
        mock_request = Mock()

        # Mock message_user
        admin.message_user = Mock()

        admin.revoke_grants(mock_request, mock_queryset)

        # Should not revoke any grants
        mock_grant1.revoke.assert_not_called()
        mock_grant2.revoke.assert_not_called()
|
||||||
|
|
||||||
|
# Should show message indicating 0 grants revoked
|
||||||
|
admin.message_user.assert_called_once()
|
||||||
|
call_args = admin.message_user.call_args[0]
|
||||||
|
assert '0 permission grant(s)' in call_args[1]
|
||||||
|
|
||||||
|
|
||||||
|
class TestAdminConfiguration:
    """Tests for admin configuration settings."""

    def test_tenant_admin_list_display(self):
        """Should have correct list_display fields."""
        from smoothschedule.identity.core.admin import TenantAdmin

        expected_fields = [
            'name',
            'schema_name',
            'is_active',
            'created_on',
            'user_count',
            'domain_list',
        ]

        assert TenantAdmin.list_display == expected_fields

    def test_tenant_admin_readonly_fields(self):
        """Should have schema_name and created_on as readonly."""
        from smoothschedule.identity.core.admin import TenantAdmin

        assert 'schema_name' in TenantAdmin.readonly_fields
        assert 'created_on' in TenantAdmin.readonly_fields

    def test_domain_admin_list_display(self):
        """Should have correct list_display fields."""
        from smoothschedule.identity.core.admin import DomainAdmin

        expected_fields = [
            'domain',
            'tenant',
            'is_primary',
            'is_custom_domain',
            'verified_status',
        ]

        assert DomainAdmin.list_display == expected_fields

    def test_permission_grant_admin_list_display(self):
        """Should have correct list_display fields."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        expected_fields = [
            'id',
            'grantor',
            'grantee',
            'action',
            'granted_at',
            'expires_at',
            'status',
            'time_left',
        ]

        assert PermissionGrantAdmin.list_display == expected_fields

    def test_permission_grant_admin_has_revoke_action(self):
        """Should have revoke_grants action configured."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        assert 'revoke_grants' in PermissionGrantAdmin.actions

    def test_tenant_admin_fieldsets_structure(self):
        """Should have properly structured fieldsets."""
        from smoothschedule.identity.core.admin import TenantAdmin

        # Check that fieldsets exist
        assert hasattr(TenantAdmin, 'fieldsets')
        assert len(TenantAdmin.fieldsets) > 0

        # Check for basic information section
        fieldset_names = [fs[0] for fs in TenantAdmin.fieldsets]
        assert 'Basic Information' in fieldset_names

    def test_domain_admin_fieldsets_structure(self):
        """Should have properly structured fieldsets."""
        from smoothschedule.identity.core.admin import DomainAdmin

        # Check that fieldsets exist
        assert hasattr(DomainAdmin, 'fieldsets')
        assert len(DomainAdmin.fieldsets) > 0

        # Check for custom domain settings section
        fieldset_names = [fs[0] for fs in DomainAdmin.fieldsets]
        assert 'Custom Domain Settings' in fieldset_names

    def test_permission_grant_admin_fieldsets_structure(self):
        """Should have properly structured fieldsets."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        # Check that fieldsets exist
        assert hasattr(PermissionGrantAdmin, 'fieldsets')
        assert len(PermissionGrantAdmin.fieldsets) > 0

        # Check for audit trail section
        fieldset_names = [fs[0] for fs in PermissionGrantAdmin.fieldsets]
        assert 'Audit Trail' in fieldset_names

    def test_permission_grant_admin_readonly_audit_fields(self):
        """Should have audit fields as readonly."""
        from smoothschedule.identity.core.admin import PermissionGrantAdmin

        readonly = PermissionGrantAdmin.readonly_fields

        assert 'granted_at' in readonly
        assert 'grantor' in readonly
        assert 'grantee' in readonly
        assert 'ip_address' in readonly
        assert 'user_agent' in readonly
@@ -0,0 +1,381 @@
"""
Unit tests for identity/core/services.py

Tests the StorageQuotaService class using mocks to avoid database hits.
Following the testing pyramid: prefer fast unit tests over slow integration tests.
"""
from unittest.mock import Mock, patch, MagicMock

import pytest


class TestStorageQuotaServiceGetQuotaBytes:
    """Tests for get_quota_bytes static method."""

    @patch('smoothschedule.billing.services.entitlements.EntitlementService')
    def test_get_quota_bytes_with_subscription(self, mock_entitlement_service):
        """Should get storage from billing subscription when available."""
        from smoothschedule.identity.core.services import StorageQuotaService

        # Mock tenant with subscription
        mock_tenant = Mock()
        mock_tenant.billing_subscription = Mock()
        mock_entitlement_service.get_feature_value.return_value = 5  # 5 GB

        result = StorageQuotaService.get_quota_bytes(mock_tenant)

        # 5 GB = 5 * 1024^3 bytes
        expected_bytes = 5 * 1024 * 1024 * 1024
        assert result == expected_bytes
        mock_entitlement_service.get_feature_value.assert_called_once_with(
            mock_tenant, 'storage_gb', default=1
        )

    @patch('smoothschedule.billing.services.entitlements.EntitlementService')
    def test_get_quota_bytes_without_subscription(self, mock_entitlement_service):
        """Should use default storage when no subscription."""
        from smoothschedule.identity.core.services import StorageQuotaService

        # Mock tenant without subscription
        mock_tenant = Mock(spec=['id'])
        # billing_subscription present but None, so the service falls back to the default
        mock_tenant.billing_subscription = None

        result = StorageQuotaService.get_quota_bytes(mock_tenant)

        # Default 1 GB = 1 * 1024^3 bytes
        expected_bytes = 1 * 1024 * 1024 * 1024
        assert result == expected_bytes
        # Should not call EntitlementService
        mock_entitlement_service.get_feature_value.assert_not_called()

    @patch('smoothschedule.billing.services.entitlements.EntitlementService')
    def test_get_quota_bytes_tenant_without_attribute(self, mock_entitlement_service):
        """Should use default storage when tenant has no billing_subscription attribute."""
        from smoothschedule.identity.core.services import StorageQuotaService

        # Mock tenant without billing_subscription attribute
        mock_tenant = Mock(spec=['id', 'name'])

        result = StorageQuotaService.get_quota_bytes(mock_tenant)

        # Default 1 GB
        expected_bytes = 1 * 1024 * 1024 * 1024
        assert result == expected_bytes
class TestStorageQuotaServiceGetUsage:
    """Tests for get_usage static method."""

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_quota_bytes')
    def test_get_usage_returns_usage_dict(self, mock_get_quota, mock_usage_model):
        """Should return formatted usage dictionary."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()

        # Mock usage record
        mock_usage = Mock()
        mock_usage.bytes_used = 500 * 1024 * 1024  # 500 MB
        mock_usage.file_count = 10
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)

        # Mock quota
        mock_get_quota.return_value = 1024 * 1024 * 1024  # 1 GB

        result = StorageQuotaService.get_usage(mock_tenant)

        assert result['bytes_used'] == 500 * 1024 * 1024
        assert result['bytes_total'] == 1024 * 1024 * 1024
        assert result['file_count'] == 10
        # 500 MB / 1024 MB * 100 = ~48.83%
        assert result['percent_used'] == 48.83

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_quota_bytes')
    def test_get_usage_creates_usage_if_not_exists(self, mock_get_quota, mock_usage_model):
        """Should create usage record if it doesn't exist."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock(bytes_used=0, file_count=0)
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, True)
        mock_get_quota.return_value = 1024 * 1024 * 1024

        StorageQuotaService.get_usage(mock_tenant)

        mock_usage_model.objects.get_or_create.assert_called_once_with(tenant=mock_tenant)

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_quota_bytes')
    def test_get_usage_handles_zero_quota(self, mock_get_quota, mock_usage_model):
        """Should handle zero quota without division error."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock(bytes_used=100, file_count=1)
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_get_quota.return_value = 0

        result = StorageQuotaService.get_usage(mock_tenant)

        assert result['percent_used'] == 0

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_quota_bytes')
    def test_get_usage_rounds_percent(self, mock_get_quota, mock_usage_model):
        """Should round percent_used to 2 decimal places."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock(bytes_used=333, file_count=1)
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_get_quota.return_value = 1000

        result = StorageQuotaService.get_usage(mock_tenant)

        # 333/1000 * 100 = 33.3
        assert result['percent_used'] == 33.3
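The zero-quota and rounding tests above together specify the `percent_used` arithmetic. A minimal sketch of that calculation, assuming the behavior the assertions describe (not the actual service code):

```python
def percent_used(bytes_used, bytes_total):
    """Hypothetical percent calculation matching the test expectations:
    a zero (or missing) quota yields 0 rather than a ZeroDivisionError,
    and the result is rounded to two decimal places."""
    if not bytes_total:
        return 0
    return round(bytes_used / bytes_total * 100, 2)
```

With 500 MB used of a 1 GiB quota this gives 48.83; 333 of 1000 bytes gives 33.3; any usage against a zero quota gives 0.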
class TestStorageQuotaServiceCanUpload:
    """Tests for can_upload static method."""

    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_can_upload_allows_when_under_quota(self, mock_get_usage):
        """Should allow upload when under quota."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_get_usage.return_value = {
            'bytes_used': 500 * 1024 * 1024,    # 500 MB used
            'bytes_total': 1024 * 1024 * 1024,  # 1 GB total
            'file_count': 10,
            'percent_used': 48.83
        }

        can_upload, error_msg = StorageQuotaService.can_upload(
            mock_tenant,
            100 * 1024 * 1024  # Try to upload 100 MB
        )

        assert can_upload is True
        assert error_msg == ''

    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_can_upload_denies_when_over_quota(self, mock_get_usage):
        """Should deny upload when it would exceed quota."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_get_usage.return_value = {
            'bytes_used': 900 * 1024 * 1024,    # 900 MB used
            'bytes_total': 1024 * 1024 * 1024,  # 1 GB total
            'file_count': 10,
            'percent_used': 87.89
        }

        can_upload, error_msg = StorageQuotaService.can_upload(
            mock_tenant,
            200 * 1024 * 1024  # Try to upload 200 MB (would exceed)
        )

        assert can_upload is False
        assert 'Storage quota exceeded' in error_msg
        assert '124.0 MB remaining' in error_msg
        assert '200.0 MB' in error_msg

    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_can_upload_allows_exact_quota(self, mock_get_usage):
        """Should allow upload when exactly at quota."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_get_usage.return_value = {
            'bytes_used': 900 * 1024 * 1024,
            'bytes_total': 1024 * 1024 * 1024,
            'file_count': 10,
            'percent_used': 87.89
        }

        can_upload, error_msg = StorageQuotaService.can_upload(
            mock_tenant,
            124 * 1024 * 1024  # Exactly fills quota
        )

        assert can_upload is True
        assert error_msg == ''
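The three `can_upload` tests fix an `(allowed, message)` contract: an upload is allowed up to and including the exact remaining space, and a rejection message reports both the remaining space and the requested size in MB. A sketch of that contract (a hypothetical stand-in; the real `StorageQuotaService.can_upload` may word its message differently, beyond the fragments the tests assert):

```python
def can_upload_sketch(usage, upload_bytes):
    """Hypothetical (allowed, message) check inferred from the tests above."""
    remaining = usage['bytes_total'] - usage['bytes_used']
    if upload_bytes > remaining:  # strictly greater: an exact fit is still allowed
        mb = 1024 * 1024
        return False, (
            f"Storage quota exceeded: {remaining / mb:.1f} MB remaining, "
            f"upload requires {upload_bytes / mb:.1f} MB"
        )
    return True, ''
```

With 900 MB used of 1 GiB, a 200 MB upload is rejected with "124.0 MB remaining", while a 124 MB upload exactly fills the quota and passes.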
class TestStorageQuotaServiceUpdateUsage:
    """Tests for update_usage static method."""

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    def test_update_usage_adds_bytes_on_positive_delta(self, mock_usage_model):
        """Should call add_file when bytes_delta is positive."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)

        StorageQuotaService.update_usage(mock_tenant, bytes_delta=1024, count_delta=1)

        mock_usage.add_file.assert_called_once_with(1024)

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    def test_update_usage_removes_bytes_on_negative_delta(self, mock_usage_model):
        """Should call remove_file when bytes_delta is negative."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)

        StorageQuotaService.update_usage(mock_tenant, bytes_delta=-2048, count_delta=-1)

        mock_usage.remove_file.assert_called_once_with(2048)

    @patch('django.db.models.F')
    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    def test_update_usage_handles_count_only_change(self, mock_usage_model, mock_f):
        """Should update count when bytes_delta is 0 but count_delta is not."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock(pk=123)
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_usage_model.objects.filter.return_value.update = Mock()

        StorageQuotaService.update_usage(mock_tenant, bytes_delta=0, count_delta=1)

        # Should filter for the usage record and update count
        mock_usage_model.objects.filter.assert_called_once_with(pk=123)

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    def test_update_usage_does_nothing_on_zero_deltas(self, mock_usage_model):
        """Should do nothing when both deltas are zero."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock()
        mock_usage = Mock()
        mock_usage.add_file = Mock()
        mock_usage.remove_file = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)

        StorageQuotaService.update_usage(mock_tenant, bytes_delta=0, count_delta=0)

        # Should not call add_file or remove_file
        mock_usage.add_file.assert_not_called()
        mock_usage.remove_file.assert_not_called()
class TestStorageQuotaServiceRecalculateUsage:
    """Tests for recalculate_usage static method."""

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.scheduling.schedule.models.MediaFile')
    @patch('django.db.connection')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_recalculate_usage_switches_schema(
        self, mock_get_usage, mock_connection, mock_media_file, mock_usage_model
    ):
        """Should switch to tenant schema before querying."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock(schema_name='test_tenant')
        mock_cursor = Mock()
        mock_connection.cursor.return_value.__enter__ = Mock(return_value=mock_cursor)
        mock_connection.cursor.return_value.__exit__ = Mock(return_value=False)

        # Mock aggregation
        mock_media_file.objects.aggregate.return_value = {
            'total_bytes': 1024,
            'total_files': 5
        }

        mock_usage = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_get_usage.return_value = {'bytes_used': 1024, 'bytes_total': 2048, 'file_count': 5, 'percent_used': 50.0}

        StorageQuotaService.recalculate_usage(mock_tenant)

        # Should set search_path to tenant schema
        mock_cursor.execute.assert_called_once_with("SET search_path TO test_tenant")

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.scheduling.schedule.models.MediaFile')
    @patch('django.db.connection')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_recalculate_usage_aggregates_media_files(
        self, mock_get_usage, mock_connection, mock_media_file, mock_usage_model
    ):
        """Should aggregate file sizes and counts from MediaFile."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock(schema_name='test_tenant')
        mock_cursor = Mock()
        mock_connection.cursor.return_value.__enter__ = Mock(return_value=mock_cursor)
        mock_connection.cursor.return_value.__exit__ = Mock(return_value=False)

        # Mock aggregation with Sum and Count
        mock_media_file.objects.aggregate.return_value = {
            'total_bytes': 5000,
            'total_files': 10
        }

        mock_usage = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_get_usage.return_value = {'bytes_used': 5000, 'bytes_total': 10000, 'file_count': 10, 'percent_used': 50.0}

        result = StorageQuotaService.recalculate_usage(mock_tenant)

        # Should update usage record with aggregated values
        assert mock_usage.bytes_used == 5000
        assert mock_usage.file_count == 10
        mock_usage.save.assert_called_once()

        # Should return updated usage
        assert result['bytes_used'] == 5000
        assert result['file_count'] == 10

    @patch('smoothschedule.identity.core.models.TenantStorageUsage')
    @patch('smoothschedule.scheduling.schedule.models.MediaFile')
    @patch('django.db.connection')
    @patch('smoothschedule.identity.core.services.StorageQuotaService.get_usage')
    def test_recalculate_usage_handles_null_aggregation(
        self, mock_get_usage, mock_connection, mock_media_file, mock_usage_model
    ):
        """Should handle null values from aggregation (no files)."""
        from smoothschedule.identity.core.services import StorageQuotaService

        mock_tenant = Mock(schema_name='test_tenant')
        mock_cursor = Mock()
        mock_connection.cursor.return_value.__enter__ = Mock(return_value=mock_cursor)
        mock_connection.cursor.return_value.__exit__ = Mock(return_value=False)

        # Mock empty aggregation (no files)
        mock_media_file.objects.aggregate.return_value = {
            'total_bytes': None,
            'total_files': None
        }

        mock_usage = Mock()
        mock_usage_model.objects.get_or_create.return_value = (mock_usage, False)
        mock_get_usage.return_value = {'bytes_used': 0, 'bytes_total': 1024, 'file_count': 0, 'percent_used': 0.0}

        StorageQuotaService.recalculate_usage(mock_tenant)

        # Should set to 0 when None
        assert mock_usage.bytes_used == 0
        assert mock_usage.file_count == 0
class TestStorageQuotaServiceDefaultConstant:
    """Tests for DEFAULT_STORAGE_GB constant."""

    def test_default_storage_constant_value(self):
        """Should have correct default storage value."""
        from smoothschedule.identity.core.services import StorageQuotaService

        assert StorageQuotaService.DEFAULT_STORAGE_GB == 1
@@ -211,3 +211,340 @@ class TestSeedPlatformPluginsOnTenantCreate:
        seed_platform_plugins_on_tenant_create(Mock(), instance, created=True)

        mock_seed.assert_called_once_with('new_tenant')


class TestCreateSiteForTenant:
    """Tests for _create_site_for_tenant function."""

    @patch('smoothschedule.identity.core.signals.logger')
    @patch('smoothschedule.platform.tenant_sites.models.Site')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_creates_site_when_none_exists(self, mock_tenant_model, mock_site_model, mock_logger):
        """Should create Site when none exists for tenant."""
        from smoothschedule.identity.core.signals import _create_site_for_tenant

        mock_tenant = Mock(id=1, schema_name='test_tenant')
        mock_tenant_model.objects.get.return_value = mock_tenant

        # No existing site
        mock_site_model.objects.filter.return_value.exists.return_value = False
        mock_site = Mock()
        mock_site_model.objects.create.return_value = mock_site

        _create_site_for_tenant(1)

        # Should create site
        mock_site_model.objects.create.assert_called_once_with(
            tenant=mock_tenant,
            is_enabled=True
        )
        mock_logger.info.assert_called()

    @patch('smoothschedule.identity.core.signals.logger')
    @patch('smoothschedule.platform.tenant_sites.models.Site')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_skips_creation_when_site_exists(self, mock_tenant_model, mock_site_model, mock_logger):
        """Should skip site creation when site already exists."""
        from smoothschedule.identity.core.signals import _create_site_for_tenant

        mock_tenant = Mock(id=1, schema_name='test_tenant')
        mock_tenant_model.objects.get.return_value = mock_tenant

        # Site already exists
        mock_site_model.objects.filter.return_value.exists.return_value = True

        _create_site_for_tenant(1)

        # Should not create site
        mock_site_model.objects.create.assert_not_called()

    @patch('smoothschedule.identity.core.signals.logger')
    @patch('smoothschedule.platform.tenant_sites.models.Site')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_logs_error_when_tenant_not_found(self, mock_tenant_model, mock_site_model, mock_logger):
        """Should log error when tenant doesn't exist."""
        from smoothschedule.identity.core.signals import _create_site_for_tenant
        from django.core.exceptions import ObjectDoesNotExist

        # Use ObjectDoesNotExist which is a proper exception class
        mock_tenant_model.DoesNotExist = ObjectDoesNotExist
        mock_tenant_model.objects.get.side_effect = ObjectDoesNotExist

        _create_site_for_tenant(999)

        # Should log error
        mock_logger.error.assert_called()
        # Should not attempt to create site
        mock_site_model.objects.create.assert_not_called()

    @patch('smoothschedule.identity.core.signals.logger')
    @patch('smoothschedule.platform.tenant_sites.models.Site')
    @patch('smoothschedule.identity.core.models.Tenant')
    def test_logs_error_on_exception(self, mock_tenant_model, mock_site_model, mock_logger):
        """Should log error when exception occurs during site creation."""
        from smoothschedule.identity.core.signals import _create_site_for_tenant
        from django.core.exceptions import ObjectDoesNotExist

        # Need to set DoesNotExist properly for the except clause to work
        mock_tenant_model.DoesNotExist = ObjectDoesNotExist

        mock_tenant = Mock(id=1, schema_name='test_tenant')
        mock_tenant_model.objects.get.return_value = mock_tenant

        # Simulate exception during creation
        mock_site_model.objects.filter.return_value.exists.return_value = False
        mock_site_model.objects.create.side_effect = Exception("Test error")

        _create_site_for_tenant(1)

        # Should log error
        mock_logger.error.assert_called()
class TestCreateSiteOnTenantCreate:
    """Tests for create_site_on_tenant_create signal handler."""

    @patch('smoothschedule.identity.core.signals.transaction')
    def test_schedules_site_creation_on_commit(self, mock_transaction):
        """Should schedule site creation on transaction commit."""
        from smoothschedule.identity.core.signals import create_site_on_tenant_create

        instance = Mock()
        instance.schema_name = 'tenant_schema'
        instance.id = 123

        create_site_on_tenant_create(Mock(), instance, created=True)

        mock_transaction.on_commit.assert_called_once()

    @patch('smoothschedule.identity.core.signals.transaction')
    def test_does_not_trigger_on_update(self, mock_transaction):
        """Should not trigger when tenant is updated (not created)."""
        from smoothschedule.identity.core.signals import create_site_on_tenant_create

        instance = Mock()
        instance.schema_name = 'tenant_schema'

        create_site_on_tenant_create(Mock(), instance, created=False)

        mock_transaction.on_commit.assert_not_called()

    @patch('smoothschedule.identity.core.signals.transaction')
    def test_does_not_trigger_for_public_schema(self, mock_transaction):
        """Should not trigger for public schema."""
        from smoothschedule.identity.core.signals import create_site_on_tenant_create

        instance = Mock()
        instance.schema_name = 'public'

        create_site_on_tenant_create(Mock(), instance, created=True)

        mock_transaction.on_commit.assert_not_called()

    @patch('smoothschedule.identity.core.signals.transaction')
    @patch('smoothschedule.identity.core.signals._create_site_for_tenant')
    def test_on_commit_calls_create_function(self, mock_create, mock_transaction):
        """Should call _create_site_for_tenant when transaction commits."""
        from smoothschedule.identity.core.signals import create_site_on_tenant_create

        instance = Mock()
        instance.schema_name = 'new_tenant'
        instance.id = 456

        # Capture the callback passed to on_commit
        def capture_callback(callback):
            callback()

        mock_transaction.on_commit.side_effect = capture_callback

        create_site_on_tenant_create(Mock(), instance, created=True)

        mock_create.assert_called_once_with(456)
class TestSeedEmailTemplatesForTenant:
|
||||||
|
"""Tests for _seed_email_templates_for_tenant function."""
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
@patch('smoothschedule.identity.core.signals.logger')
|
||||||
|
def test_logs_start_of_seeding(self, mock_logger, mock_schema_context):
|
||||||
|
"""Should log when starting to seed email templates."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.return_value.__enter__ = Mock()
|
||||||
|
mock_schema_context.return_value.__exit__ = Mock(return_value=False)
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.models.PuckEmailTemplate') as mock_template:
|
||||||
|
mock_template.objects.filter.return_value.exists.return_value = True
|
||||||
|
with patch('smoothschedule.communication.messaging.email_types.EmailType', []):
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
mock_logger.info.assert_called()
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
def test_creates_templates_that_dont_exist(self, mock_schema_context):
|
||||||
|
"""Should create email templates that don't already exist."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.return_value.__enter__ = Mock()
|
||||||
|
mock_schema_context.return_value.__exit__ = Mock(return_value=False)
|
||||||
|
|
||||||
|
# Mock EmailType enum
|
||||||
|
mock_email_type = Mock()
|
||||||
|
mock_email_type.value = 'APPOINTMENT_CONFIRMATION'
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.models.PuckEmailTemplate') as mock_template:
|
||||||
|
mock_template.objects.filter.return_value.exists.return_value = False
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.email_types.EmailType', [mock_email_type]):
|
||||||
|
with patch('smoothschedule.communication.messaging.default_templates.DEFAULT_TEMPLATES', {
|
||||||
|
'APPOINTMENT_CONFIRMATION': {
|
||||||
|
'subject_template': 'Test Subject',
|
||||||
|
'puck_data': {'content': [], 'root': {}}
|
||||||
|
}
|
||||||
|
}):
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
mock_template.objects.create.assert_called_once()
|
||||||
|
call_kwargs = mock_template.objects.create.call_args[1]
|
||||||
|
assert call_kwargs['email_type'] == 'APPOINTMENT_CONFIRMATION'
|
||||||
|
assert call_kwargs['is_active'] is True
|
||||||
|
assert call_kwargs['is_customized'] is False
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
def test_skips_existing_templates(self, mock_schema_context):
|
||||||
|
"""Should skip templates that already exist."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.return_value.__enter__ = Mock()
|
||||||
|
mock_schema_context.return_value.__exit__ = Mock(return_value=False)
|
||||||
|
|
||||||
|
mock_email_type = Mock()
|
||||||
|
mock_email_type.value = 'EXISTING_TEMPLATE'
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.models.PuckEmailTemplate') as mock_template:
|
||||||
|
mock_template.objects.filter.return_value.exists.return_value = True
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.email_types.EmailType', [mock_email_type]):
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
mock_template.objects.create.assert_not_called()
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
@patch('smoothschedule.identity.core.signals.logger')
|
||||||
|
def test_logs_warning_for_missing_default_template(self, mock_logger, mock_schema_context):
|
||||||
|
"""Should log warning when default template data is missing."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.return_value.__enter__ = Mock()
|
||||||
|
mock_schema_context.return_value.__exit__ = Mock(return_value=False)
|
||||||
|
|
||||||
|
mock_email_type = Mock()
|
||||||
|
mock_email_type.value = 'UNKNOWN_TYPE'
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.models.PuckEmailTemplate') as mock_template:
|
||||||
|
mock_template.objects.filter.return_value.exists.return_value = False
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.email_types.EmailType', [mock_email_type]):
|
||||||
|
with patch('smoothschedule.communication.messaging.default_templates.DEFAULT_TEMPLATES', {}):
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
mock_logger.warning.assert_called()
|
||||||
|
mock_template.objects.create.assert_not_called()
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
@patch('smoothschedule.identity.core.signals.logger')
|
||||||
|
def test_logs_error_on_exception(self, mock_logger, mock_schema_context):
|
||||||
|
"""Should log error when exception occurs."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.side_effect = Exception("Test error")
|
||||||
|
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
mock_logger.error.assert_called()
|
||||||
|
|
||||||
|
@patch('django_tenants.utils.schema_context')
|
||||||
|
@patch('smoothschedule.identity.core.signals.logger')
|
||||||
|
def test_logs_created_count(self, mock_logger, mock_schema_context):
|
||||||
|
"""Should log the number of templates created."""
|
||||||
|
from smoothschedule.identity.core.signals import _seed_email_templates_for_tenant
|
||||||
|
|
||||||
|
mock_schema_context.return_value.__enter__ = Mock()
|
||||||
|
mock_schema_context.return_value.__exit__ = Mock(return_value=False)
|
||||||
|
|
||||||
|
mock_type1 = Mock(value='TYPE1')
|
||||||
|
mock_type2 = Mock(value='TYPE2')
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.models.PuckEmailTemplate') as mock_template:
|
||||||
|
mock_template.objects.filter.return_value.exists.return_value = False
|
||||||
|
|
||||||
|
with patch('smoothschedule.communication.messaging.email_types.EmailType', [mock_type1, mock_type2]):
|
||||||
|
with patch('smoothschedule.communication.messaging.default_templates.DEFAULT_TEMPLATES', {
|
||||||
|
'TYPE1': {'subject_template': 'S1', 'puck_data': {}},
|
||||||
|
'TYPE2': {'subject_template': 'S2', 'puck_data': {}}
|
||||||
|
}):
|
||||||
|
_seed_email_templates_for_tenant('test_schema')
|
||||||
|
|
||||||
|
# Should log info with created count
|
||||||
|
info_calls = [str(call) for call in mock_logger.info.call_args_list]
|
||||||
|
assert any('2' in str(call) for call in info_calls)
|
||||||
|
|
||||||
|
|
||||||
|
class TestSeedEmailTemplatesOnTenantCreate:
|
||||||
|
"""Tests for seed_email_templates_on_tenant_create signal handler."""
|
||||||
|
|
||||||
|
@patch('smoothschedule.identity.core.signals.transaction')
|
||||||
|
def test_schedules_seeding_on_commit(self, mock_transaction):
|
||||||
|
"""Should schedule template seeding on transaction commit."""
|
||||||
|
from smoothschedule.identity.core.signals import seed_email_templates_on_tenant_create
|
||||||
|
|
||||||
|
instance = Mock()
|
||||||
|
instance.schema_name = 'tenant_schema'
|
||||||
|
|
||||||
|
seed_email_templates_on_tenant_create(Mock(), instance, created=True)
|
||||||
|
|
||||||
|
mock_transaction.on_commit.assert_called_once()
|
||||||
|
|
||||||
|
@patch('smoothschedule.identity.core.signals.transaction')
|
||||||
|
def test_does_not_trigger_on_update(self, mock_transaction):
|
||||||
|
"""Should not trigger when tenant is updated (not created)."""
|
||||||
|
from smoothschedule.identity.core.signals import seed_email_templates_on_tenant_create
|
||||||
|
|
||||||
|
instance = Mock()
|
||||||
|
instance.schema_name = 'tenant_schema'
|
||||||
|
|
||||||
|
seed_email_templates_on_tenant_create(Mock(), instance, created=False)
|
||||||
|
|
||||||
|
mock_transaction.on_commit.assert_not_called()
|
||||||
|
|
||||||
|
@patch('smoothschedule.identity.core.signals.transaction')
|
||||||
|
def test_does_not_trigger_for_public_schema(self, mock_transaction):
|
||||||
|
"""Should not trigger for public schema."""
|
||||||
|
from smoothschedule.identity.core.signals import seed_email_templates_on_tenant_create
|
||||||
|
|
||||||
|
instance = Mock()
|
||||||
|
instance.schema_name = 'public'
|
||||||
|
|
||||||
|
seed_email_templates_on_tenant_create(Mock(), instance, created=True)
|
||||||
|
|
||||||
|
mock_transaction.on_commit.assert_not_called()
|
||||||
|
|
||||||
|
@patch('smoothschedule.identity.core.signals.transaction')
|
||||||
|
@patch('smoothschedule.identity.core.signals._seed_email_templates_for_tenant')
|
||||||
|
def test_on_commit_calls_seed_function(self, mock_seed, mock_transaction):
|
||||||
|
"""Should call _seed_email_templates_for_tenant when transaction commits."""
|
||||||
|
from smoothschedule.identity.core.signals import seed_email_templates_on_tenant_create
|
||||||
|
|
||||||
|
instance = Mock()
|
||||||
|
instance.schema_name = 'new_tenant'
|
||||||
|
|
||||||
|
# Capture the callback passed to on_commit
|
||||||
|
def capture_callback(callback):
|
||||||
|
callback()
|
||||||
|
|
||||||
|
mock_transaction.on_commit.side_effect = capture_callback
|
||||||
|
|
||||||
|
seed_email_templates_on_tenant_create(Mock(), instance, created=True)
|
||||||
|
|
||||||
|
mock_seed.assert_called_once_with('new_tenant')
|
||||||
|
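The `capture_callback` pattern used in the tests above can be sketched in isolation: patching `transaction.on_commit` with a `side_effect` that records the callback lets a unit test assert both that work was deferred and what it does when the commit fires. The `defer_provisioning` helper below is illustrative, not part of the codebase.

```python
from unittest.mock import Mock


def defer_provisioning(transaction_module, schema_name, results):
    # Stand-in for a post_save handler: defer work until the transaction commits.
    transaction_module.on_commit(lambda: results.append(schema_name))


mock_transaction = Mock()
captured = []
# Capture the callback instead of waiting for a real database commit.
mock_transaction.on_commit.side_effect = lambda cb: captured.append(cb)

results = []
defer_provisioning(mock_transaction, "new_tenant", results)
mock_transaction.on_commit.assert_called_once()
assert results == []          # nothing has run yet
captured[0]()                 # simulate the commit firing the deferred callback
assert results == ["new_tenant"]
```

Invoking the captured callback by hand (rather than via `side_effect = lambda cb: cb()`) also lets a test verify that nothing happens before the commit.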
@@ -0,0 +1,537 @@
"""
Default flow definitions for auto-provisioning.

Each flow uses SmoothSchedule piece triggers and actions to create
standard email automation workflows for every tenant.

Flow structure follows Activepieces format:
- trigger: The trigger configuration (polling or webhook)
- Each action is nested in the previous step's "nextAction" field
"""

from typing import Any, Dict, Optional

# Version for tracking upgrades
# 1.0.0 - Initial default flows
# 1.1.0 - Fixed context variable names to match email template tags
FLOW_VERSION = "1.1.0"

# System email types for the send_email action
EMAIL_TYPES = {
    "appointment_confirmation": "appointment_confirmation",
    "appointment_reminder": "appointment_reminder",
    "thank_you": "payment_receipt",  # Use payment_receipt template for thank you
    "payment_receipt": "payment_receipt",
}

# Sample data for each flow type - used to pre-populate trigger outputs
SAMPLE_DATA = {
    "event_created": {
        "id": 12345,
        "title": "Consultation with John Doe",
        "start_time": "2024-12-15T14:00:00Z",
        "end_time": "2024-12-15T15:00:00Z",
        "status": "SCHEDULED",
        "resource_name": "Dr. Smith",
        "location": "Main Office",
        "service": {
            "id": 1,
            "name": "Consultation",
            "price": "200.00",
        },
        "customer": {
            "id": 50,
            "first_name": "John",
            "last_name": "Doe",
            "email": "john.doe@example.com",
            "phone": "+1-555-0100",
        },
        "event": {
            "id": 12345,
            "title": "Consultation with John Doe",
            "start_time": "2024-12-15T14:00:00Z",
            "end_time": "2024-12-15T15:00:00Z",
            "status": "SCHEDULED",
            "resource_name": "Dr. Smith",
        },
    },
    "upcoming_events": {
        "id": 12345,
        "title": "Consultation with John Doe",
        "start_time": "2024-12-15T14:00:00Z",
        "end_time": "2024-12-15T15:00:00Z",
        "status": "SCHEDULED",
        "hours_until_start": 23.5,
        "reminder_hours_before": 24,
        "should_send_reminder": True,
        "resource_name": "Dr. Smith",
        "location": "Main Office",
        "location_address": "123 Main St, City, ST 12345",
        "service": {
            "id": 1,
            "name": "Consultation",
            "price": "200.00",
        },
        "customer": {
            "id": 50,
            "first_name": "John",
            "last_name": "Doe",
            "email": "john.doe@example.com",
            "phone": "+1-555-0100",
        },
    },
    "payment_received": {
        "id": 12345,
        "payment_intent_id": "pi_3QDEr5GvIfP3a7s90bcd1234",
        "amount": "50.00",
        "currency": "usd",
        "type": "deposit",
        "status": "SUCCEEDED",
        "created_at": "2024-12-01T10:00:00Z",
        "completed_at": "2024-12-01T10:00:05Z",
        "event": {
            "id": 100,
            "title": "Consultation with John Doe",
            "start_time": "2024-12-15T14:00:00Z",
            "end_time": "2024-12-15T15:00:00Z",
            "status": "SCHEDULED",
            "deposit_amount": "50.00",
            "final_price": "200.00",
            "remaining_balance": "150.00",
        },
        "service": {
            "id": 1,
            "name": "Consultation",
            "price": "200.00",
        },
        "customer": {
            "id": 50,
            "first_name": "John",
            "last_name": "Doe",
            "email": "john.doe@example.com",
            "phone": "+1-555-0100",
        },
    },
}

# Map flow types to their trigger names and sample data
FLOW_SAMPLE_DATA = {
    "appointment_confirmation": SAMPLE_DATA["event_created"],
    "appointment_reminder": SAMPLE_DATA["upcoming_events"],
    "thank_you": SAMPLE_DATA["payment_received"],
    "payment_deposit": SAMPLE_DATA["payment_received"],
    "payment_final": SAMPLE_DATA["payment_received"],
}


def get_sample_data_for_flow(flow_type: str) -> Dict[str, Any]:
    """
    Get the sample data for a given flow type.

    Args:
        flow_type: One of the FlowType choices from TenantDefaultFlow model

    Returns:
        Sample data dict for the flow's trigger
    """
    return FLOW_SAMPLE_DATA.get(flow_type, {})


def _create_send_email_action(
    email_type: str,
    step_name: str = "send_email",
    next_action: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
    """
    Create a send_email action step.

    Args:
        email_type: The system email template to use (e.g., "appointment_confirmation")
        step_name: Unique step name
        next_action: Optional next action in the chain

    Returns:
        Action definition dict
    """
    action = {
        "name": step_name,
        "displayName": "Send Email",
        "type": "PIECE",
        "valid": True,
        "settings": {
            "pieceName": "@activepieces/piece-smoothschedule",
            "pieceVersion": "~0.0.1",
            "pieceType": "CUSTOM",
            "actionName": "send_email",
            "input": {
                # These use Activepieces interpolation syntax:
                # {{trigger.customer.email}} references the trigger output
                "to_email": "{{trigger.customer.email}}",
                "template_type": "system",
                "email_type": email_type,
                "context": {
                    # Map trigger data to template context.
                    # Use template tag names that match email_tags.py definitions.
                    "customer_first_name": "{{trigger.customer.first_name}}",
                    "customer_last_name": "{{trigger.customer.last_name}}",
                    "customer_name": "{{trigger.customer.first_name}} {{trigger.customer.last_name}}",
                    "customer_email": "{{trigger.customer.email}}",
                    "customer_phone": "{{trigger.customer.phone}}",
                    "service_name": "{{trigger.service.name}}",
                    # Appointment fields
                    "appointment_date": "{{trigger.event.start_time}}",
                    "appointment_time": "{{trigger.event.start_time}}",
                    "appointment_datetime": "{{trigger.event.start_time}}",
                    "staff_name": "{{trigger.event.resource_name}}",
                    "location_name": "{{trigger.event.location}}",
                },
            },
            "inputUiInfo": {
                "customizedInputs": {},
            },
        },
    }

    if next_action:
        action["nextAction"] = next_action

    return action


def get_appointment_confirmation_flow() -> Dict[str, Any]:
    """
    Appointment Confirmation Flow

    Trigger: When a new event is created with status SCHEDULED
    Action: Send appointment confirmation email
    """
    return {
        "displayName": "Appointment Confirmation Email",
        "description": "Automatically send a confirmation email when an appointment is booked",
        "trigger": {
            "name": "trigger",
            "displayName": "Event Created",
            "type": "PIECE_TRIGGER",
            "valid": True,
            "settings": {
                "pieceName": "@activepieces/piece-smoothschedule",
                "pieceVersion": "~0.0.1",
                "pieceType": "CUSTOM",
                "triggerName": "event_created",
                "input": {},
                "inputUiInfo": {
                    "customizedInputs": {},
                },
            },
            "nextAction": _create_send_email_action(
                email_type=EMAIL_TYPES["appointment_confirmation"],
                step_name="send_confirmation_email",
            ),
        },
        "schemaVersion": "1",
    }


def get_appointment_reminder_flow() -> Dict[str, Any]:
    """
    Appointment Reminder Flow

    Trigger: Upcoming events (based on service reminder settings)
    Action: Send reminder email
    """
    return {
        "displayName": "Appointment Reminder Email",
        "description": "Send reminder emails before appointments (based on service settings)",
        "trigger": {
            "name": "trigger",
            "displayName": "Upcoming Event",
            "type": "PIECE_TRIGGER",
            "valid": True,
            "settings": {
                "pieceName": "@activepieces/piece-smoothschedule",
                "pieceVersion": "~0.0.1",
                "pieceType": "CUSTOM",
                "triggerName": "upcoming_events",
                "input": {
                    "hoursAhead": 24,
                    "onlyIfReminderEnabled": True,
                },
                "inputUiInfo": {
                    "customizedInputs": {},
                },
            },
            "nextAction": {
                "name": "send_reminder_email",
                "displayName": "Send Reminder Email",
                "type": "PIECE",
                "valid": True,
                "settings": {
                    "pieceName": "@activepieces/piece-smoothschedule",
                    "pieceVersion": "~0.0.1",
                    "pieceType": "CUSTOM",
                    "actionName": "send_email",
                    "input": {
                        "to_email": "{{trigger.customer.email}}",
                        "template_type": "system",
                        "email_type": EMAIL_TYPES["appointment_reminder"],
                        "context": {
                            "customer_first_name": "{{trigger.customer.first_name}}",
                            "customer_last_name": "{{trigger.customer.last_name}}",
                            "customer_name": "{{trigger.customer.first_name}} {{trigger.customer.last_name}}",
                            "customer_email": "{{trigger.customer.email}}",
                            "customer_phone": "{{trigger.customer.phone}}",
                            "service_name": "{{trigger.service.name}}",
                            "appointment_date": "{{trigger.start_time}}",
                            "appointment_time": "{{trigger.start_time}}",
                            "appointment_datetime": "{{trigger.start_time}}",
                            "staff_name": "{{trigger.resource_name}}",
                            "location_name": "{{trigger.location}}",
                            "location_address": "{{trigger.location_address}}",
                        },
                    },
                    "inputUiInfo": {
                        "customizedInputs": {},
                    },
                },
            },
        },
        "schemaVersion": "1",
    }


def get_thank_you_flow() -> Dict[str, Any]:
    """
    Thank You Email Flow

    Trigger: When a final payment is received
    Action: Send thank you email
    """
    return {
        "displayName": "Thank You Email (After Payment)",
        "description": "Send a thank you email when final payment is completed",
        "trigger": {
            "name": "trigger",
            "displayName": "Payment Received",
            "type": "PIECE_TRIGGER",
            "valid": True,
            "settings": {
                "pieceName": "@activepieces/piece-smoothschedule",
                "pieceVersion": "~0.0.1",
                "pieceType": "CUSTOM",
                "triggerName": "payment_received",
                "input": {
                    "paymentType": "final",
                },
                "inputUiInfo": {
                    "customizedInputs": {},
                },
            },
            "nextAction": {
                "name": "send_thank_you_email",
                "displayName": "Send Thank You Email",
                "type": "PIECE",
                "valid": True,
                "settings": {
                    "pieceName": "@activepieces/piece-smoothschedule",
                    "pieceVersion": "~0.0.1",
                    "pieceType": "CUSTOM",
                    "actionName": "send_email",
                    "input": {
                        "to_email": "{{trigger.customer.email}}",
                        "template_type": "system",
                        "email_type": EMAIL_TYPES["thank_you"],
                        "context": {
                            "customer_first_name": "{{trigger.customer.first_name}}",
                            "customer_last_name": "{{trigger.customer.last_name}}",
                            "customer_name": "{{trigger.customer.first_name}} {{trigger.customer.last_name}}",
                            "customer_email": "{{trigger.customer.email}}",
                            "customer_phone": "{{trigger.customer.phone}}",
                            "service_name": "{{trigger.service.name}}",
                            "amount_paid": "{{trigger.amount}}",
                            "invoice_number": "{{trigger.payment_intent_id}}",
                            "appointment_date": "{{trigger.event.start_time}}",
                            "appointment_datetime": "{{trigger.event.start_time}}",
                        },
                    },
                    "inputUiInfo": {
                        "customizedInputs": {},
                    },
                },
            },
        },
        "schemaVersion": "1",
    }


def get_deposit_payment_flow() -> Dict[str, Any]:
    """
    Deposit Payment Confirmation Flow

    Trigger: When a deposit payment is received
    Action: Send payment receipt email with deposit-specific subject
    """
    return {
        "displayName": "Deposit Payment Confirmation",
        "description": "Send a confirmation when a deposit payment is received",
        "trigger": {
            "name": "trigger",
            "displayName": "Payment Received",
            "type": "PIECE_TRIGGER",
            "valid": True,
            "settings": {
                "pieceName": "@activepieces/piece-smoothschedule",
                "pieceVersion": "~0.0.1",
                "pieceType": "CUSTOM",
                "triggerName": "payment_received",
                "input": {
                    "paymentType": "deposit",
                },
                "inputUiInfo": {
                    "customizedInputs": {},
                },
            },
            "nextAction": {
                "name": "send_deposit_confirmation",
                "displayName": "Send Deposit Confirmation",
                "type": "PIECE",
                "valid": True,
                "settings": {
                    "pieceName": "@activepieces/piece-smoothschedule",
                    "pieceVersion": "~0.0.1",
                    "pieceType": "CUSTOM",
                    "actionName": "send_email",
                    "input": {
                        "to_email": "{{trigger.customer.email}}",
                        "template_type": "system",
                        "email_type": EMAIL_TYPES["payment_receipt"],
                        "subject_override": "Deposit Received - {{trigger.service.name}}",
                        "context": {
                            "customer_first_name": "{{trigger.customer.first_name}}",
                            "customer_last_name": "{{trigger.customer.last_name}}",
                            "customer_name": "{{trigger.customer.first_name}} {{trigger.customer.last_name}}",
                            "customer_email": "{{trigger.customer.email}}",
                            "customer_phone": "{{trigger.customer.phone}}",
                            "service_name": "{{trigger.service.name}}",
                            "amount_paid": "{{trigger.amount}}",
                            "invoice_number": "{{trigger.payment_intent_id}}",
                            "deposit_amount": "{{trigger.amount}}",
                            "total_paid": "{{trigger.amount}}",
                            "appointment_date": "{{trigger.event.start_time}}",
                            "appointment_datetime": "{{trigger.event.start_time}}",
                        },
                    },
                    "inputUiInfo": {
                        "customizedInputs": {},
                    },
                },
            },
        },
        "schemaVersion": "1",
    }


def get_final_payment_flow() -> Dict[str, Any]:
    """
    Final Payment Confirmation Flow

    Trigger: When a final payment is received
    Action: Send payment receipt email
    """
    return {
        "displayName": "Final Payment Confirmation",
        "description": "Send a confirmation when the final payment is received",
        "trigger": {
            "name": "trigger",
            "displayName": "Payment Received",
            "type": "PIECE_TRIGGER",
            "valid": True,
            "settings": {
                "pieceName": "@activepieces/piece-smoothschedule",
                "pieceVersion": "~0.0.1",
                "pieceType": "CUSTOM",
                "triggerName": "payment_received",
                "input": {
                    "paymentType": "final",
                },
                "inputUiInfo": {
                    "customizedInputs": {},
                },
            },
            "nextAction": {
                "name": "send_payment_confirmation",
                "displayName": "Send Payment Confirmation",
                "type": "PIECE",
                "valid": True,
                "settings": {
                    "pieceName": "@activepieces/piece-smoothschedule",
                    "pieceVersion": "~0.0.1",
                    "pieceType": "CUSTOM",
                    "actionName": "send_email",
                    "input": {
                        "to_email": "{{trigger.customer.email}}",
                        "template_type": "system",
                        "email_type": EMAIL_TYPES["payment_receipt"],
                        "context": {
                            "customer_first_name": "{{trigger.customer.first_name}}",
                            "customer_last_name": "{{trigger.customer.last_name}}",
                            "customer_name": "{{trigger.customer.first_name}} {{trigger.customer.last_name}}",
                            "customer_email": "{{trigger.customer.email}}",
                            "customer_phone": "{{trigger.customer.phone}}",
                            "service_name": "{{trigger.service.name}}",
                            "amount_paid": "{{trigger.amount}}",
                            "invoice_number": "{{trigger.payment_intent_id}}",
                            "total_paid": "{{trigger.amount}}",
                            "appointment_date": "{{trigger.event.start_time}}",
                            "appointment_datetime": "{{trigger.event.start_time}}",
                        },
                    },
                    "inputUiInfo": {
                        "customizedInputs": {},
                    },
                },
            },
        },
        "schemaVersion": "1",
    }


# Mapping of flow types to their definition functions
FLOW_TYPE_DEFINITIONS = {
    "appointment_confirmation": get_appointment_confirmation_flow,
    "appointment_reminder": get_appointment_reminder_flow,
    "thank_you": get_thank_you_flow,
    "payment_deposit": get_deposit_payment_flow,
    "payment_final": get_final_payment_flow,
}


def get_flow_definition(flow_type: str) -> Dict[str, Any]:
    """
    Get the flow definition for a given flow type.

    Args:
        flow_type: One of the FlowType choices from TenantDefaultFlow model

    Returns:
        Flow definition dict ready for Activepieces API

    Raises:
        ValueError: If flow_type is not recognized
    """
    if flow_type not in FLOW_TYPE_DEFINITIONS:
        raise ValueError(f"Unknown flow type: {flow_type}")

    return FLOW_TYPE_DEFINITIONS[flow_type]()


def get_all_flow_definitions() -> Dict[str, Dict[str, Any]]:
    """
    Get all flow definitions.

    Returns:
        Dict mapping flow_type to its definition
    """
    return {
        flow_type: get_func()
        for flow_type, get_func in FLOW_TYPE_DEFINITIONS.items()
    }
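Because each builder embeds its successor in `nextAction`, a flow definition is effectively a singly linked list rooted at the trigger. A minimal, self-contained sketch of walking that chain (the `walk_steps` helper and the stub flow dict are illustrative, not part of the module):

```python
def walk_steps(trigger):
    """Collect step names by following the nested nextAction chain."""
    steps, node = [], trigger
    while node is not None:
        steps.append(node["name"])
        node = node.get("nextAction")
    return steps


# Stub mirroring the shape produced by the flow builders above.
flow = {
    "displayName": "Appointment Confirmation Email",
    "trigger": {
        "name": "trigger",
        "nextAction": {"name": "send_confirmation_email"},
    },
    "schemaVersion": "1",
}

print(walk_steps(flow["trigger"]))  # ['trigger', 'send_confirmation_email']
```

The same walk is what an upgrade routine would need in order to diff an installed flow against the current `FLOW_VERSION` definitions step by step.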
@@ -0,0 +1,150 @@
"""
Management command to provision SmoothSchedule connections in Activepieces
for all existing tenants.

Usage:
    docker compose -f docker-compose.local.yml exec django \
        python manage.py provision_ap_connections

Options:
    --tenant SCHEMA_NAME  Only provision for a specific tenant
    --dry-run             Show what would be done without making changes
"""
import logging
import time

from django.conf import settings
from django.core.management.base import BaseCommand

from smoothschedule.identity.core.models import Tenant
from smoothschedule.integrations.activepieces.services import provision_tenant_connection

logger = logging.getLogger(__name__)


class Command(BaseCommand):
    help = "Provision SmoothSchedule connections in Activepieces for existing tenants"

    def add_arguments(self, parser):
        parser.add_argument(
            "--tenant",
            type=str,
            help="Only provision for a specific tenant (schema_name)",
        )
        parser.add_argument(
            "--dry-run",
            action="store_true",
            help="Show what would be done without making changes",
        )
        parser.add_argument(
            "--force",
            action="store_true",
            help="Re-provision even if connection already exists",
        )
        parser.add_argument(
            "--delay",
            type=float,
            default=1.0,
            help="Delay between tenant provisioning in seconds (default: 1.0)",
        )

    def handle(self, *args, **options):
        tenant_filter = options["tenant"]
        dry_run = options["dry_run"]
        force = options["force"]
        delay = options["delay"]

        # Check if Activepieces is configured
        if not getattr(settings, "ACTIVEPIECES_JWT_SECRET", ""):
            self.stderr.write(
                self.style.ERROR(
                    "Activepieces is not configured. "
                    "Set ACTIVEPIECES_JWT_SECRET in your environment."
                )
            )
            return

        # Get tenants to provision
        if tenant_filter:
            tenants = Tenant.objects.filter(schema_name=tenant_filter)
            if not tenants.exists():
                self.stderr.write(
                    self.style.ERROR(f"Tenant '{tenant_filter}' not found")
                )
                return
        else:
            # Exclude public schema
            tenants = Tenant.objects.exclude(schema_name="public")

        total_count = tenants.count()
        self.stdout.write(f"Found {total_count} tenant(s) to process")

        if dry_run:
            self.stdout.write(self.style.WARNING("DRY RUN - no changes will be made"))

        success_count = 0
        skip_count = 0
        error_count = 0

        for i, tenant in enumerate(tenants, 1):
            self.stdout.write(f"\n[{i}/{total_count}] Processing tenant: {tenant.schema_name}")

            # Check if tenant already has a connection (unless force is set)
            if not force:
                from smoothschedule.integrations.activepieces.models import TenantActivepiecesProject
                if TenantActivepiecesProject.objects.filter(tenant=tenant).exists():
                    self.stdout.write(
                        self.style.WARNING(
                            "  Skipping - already has Activepieces project "
                            "(use --force to re-provision)"
                        )
                    )
                    skip_count += 1
                    continue

            # Check feature access (skip this check if --force is used)
            if not force and hasattr(tenant, 'has_feature') and not tenant.has_feature('can_use_plugins'):
                self.stdout.write(
                    self.style.WARNING(
                        "  Skipping - tenant doesn't have automation feature (use --force to bypass)"
                    )
                )
                skip_count += 1
                continue

            if dry_run:
                self.stdout.write(
                    self.style.SUCCESS(f"  Would provision connection for {tenant.name}")
                )
                success_count += 1
                continue

            # Actually provision the connection
            try:
                success = provision_tenant_connection(tenant)
|
||||||
|
if success:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(f" Successfully provisioned connection")
|
||||||
|
)
|
||||||
|
success_count += 1
|
||||||
|
else:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.ERROR(f" Failed to provision connection")
|
||||||
|
)
|
||||||
|
error_count += 1
|
||||||
|
except Exception as e:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.ERROR(f" Error: {e}")
|
||||||
|
)
|
||||||
|
error_count += 1
|
||||||
|
|
||||||
|
# Add delay between tenants to avoid overwhelming Activepieces
|
||||||
|
if i < total_count and delay > 0:
|
||||||
|
time.sleep(delay)
|
||||||
|
|
||||||
|
# Summary
|
||||||
|
self.stdout.write("\n" + "=" * 50)
|
||||||
|
self.stdout.write(f"Provisioning complete:")
|
||||||
|
self.stdout.write(self.style.SUCCESS(f" Success: {success_count}"))
|
||||||
|
self.stdout.write(self.style.WARNING(f" Skipped: {skip_count}"))
|
||||||
|
if error_count > 0:
|
||||||
|
self.stdout.write(self.style.ERROR(f" Errors: {error_count}"))
|
||||||
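The counting loop in `handle()` (skip already-provisioned tenants, honor dry-run, pause between items) can be exercised without Django. A minimal sketch — the `process_with_delay` helper below is illustrative, not part of the codebase:

```python
import time


def process_with_delay(items, provision, already_provisioned, delay=0.0, dry_run=False):
    """Mirrors the command's loop: skip already-provisioned items, honor
    dry-run, count outcomes, and sleep between items to pace the downstream
    service."""
    success = skipped = errors = 0
    total = len(items)
    for i, item in enumerate(items, 1):
        if already_provisioned(item):
            skipped += 1
            continue
        if dry_run:
            success += 1
            continue
        try:
            if provision(item):
                success += 1
            else:
                errors += 1
        except Exception:
            errors += 1
        # Only sleep between items, never after the last one
        if i < total and delay > 0:
            time.sleep(delay)
    return success, skipped, errors


# Example: "b" is already provisioned, "c" fails to provision
result = process_with_delay(
    ["a", "b", "c"],
    provision=lambda t: t != "c",
    already_provisioned=lambda t: t == "b",
)
print(result)  # (1, 1, 1)
```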
@@ -0,0 +1,36 @@
# Generated by Django 5.2.8 on 2025-12-21 21:22

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('activepieces', '0001_initial'),
        ('core', '0030_add_sidebar_text_color'),
    ]

    operations = [
        migrations.CreateModel(
            name='TenantDefaultFlow',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('flow_type', models.CharField(choices=[('appointment_confirmation', 'Appointment Confirmation Email'), ('appointment_reminder', 'Appointment Reminder'), ('thank_you', 'Thank You Email (After Payment)'), ('payment_deposit', 'Deposit Payment Confirmation'), ('payment_final', 'Final Payment Confirmation')], help_text='Type of default flow', max_length=50)),
                ('activepieces_flow_id', models.CharField(help_text='The Activepieces flow ID', max_length=255)),
                ('is_modified', models.BooleanField(default=False, help_text='Whether user has modified this flow from default')),
                ('default_flow_json', models.JSONField(help_text='The original default flow definition for restore')),
                ('version', models.CharField(default='1.0.0', help_text='Version of the default flow template', max_length=20)),
                ('is_enabled', models.BooleanField(default=True, help_text='Whether this flow is enabled in Activepieces')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('tenant', models.ForeignKey(help_text='The tenant this default flow belongs to', on_delete=django.db.models.deletion.CASCADE, related_name='default_flows', to='core.tenant')),
            ],
            options={
                'verbose_name': 'Tenant Default Flow',
                'verbose_name_plural': 'Tenant Default Flows',
                'indexes': [models.Index(fields=['tenant', 'flow_type'], name='activepiece_tenant__a23277_idx'), models.Index(fields=['activepieces_flow_id'], name='activepiece_activep_452ab8_idx')],
                'unique_together': {('tenant', 'flow_type')},
            },
        ),
    ]
@@ -74,3 +74,68 @@ class TenantActivepiecesUser(models.Model):

    def __str__(self):
        return f"{self.user.email} -> {self.activepieces_user_id}"


class TenantDefaultFlow(models.Model):
    """
    Tracks default automation flows provisioned for a tenant.

    Used for:
    1. Identifying which flows are "default" vs user-created
    2. Enabling restore to default functionality
    3. Tracking if user has modified the flow
    """

    class FlowType(models.TextChoices):
        APPOINTMENT_CONFIRMATION = "appointment_confirmation", "Appointment Confirmation Email"
        APPOINTMENT_REMINDER = "appointment_reminder", "Appointment Reminder"
        THANK_YOU_EMAIL = "thank_you", "Thank You Email (After Payment)"
        PAYMENT_DEPOSIT = "payment_deposit", "Deposit Payment Confirmation"
        PAYMENT_FINAL = "payment_final", "Final Payment Confirmation"

    tenant = models.ForeignKey(
        "core.Tenant",
        on_delete=models.CASCADE,
        related_name="default_flows",
        help_text="The tenant this default flow belongs to",
    )
    flow_type = models.CharField(
        max_length=50,
        choices=FlowType.choices,
        help_text="Type of default flow",
    )
    activepieces_flow_id = models.CharField(
        max_length=255,
        help_text="The Activepieces flow ID",
    )
    is_modified = models.BooleanField(
        default=False,
        help_text="Whether user has modified this flow from default",
    )
    default_flow_json = models.JSONField(
        help_text="The original default flow definition for restore",
    )
    version = models.CharField(
        max_length=20,
        default="1.0.0",
        help_text="Version of the default flow template",
    )
    is_enabled = models.BooleanField(
        default=True,
        help_text="Whether this flow is enabled in Activepieces",
    )
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        app_label = "activepieces"
        verbose_name = "Tenant Default Flow"
        verbose_name_plural = "Tenant Default Flows"
        unique_together = ["tenant", "flow_type"]
        indexes = [
            models.Index(fields=["tenant", "flow_type"]),
            models.Index(fields=["activepieces_flow_id"]),
        ]

    def __str__(self):
        return f"{self.tenant.name} - {self.get_flow_type_display()}"
@@ -35,6 +35,8 @@ ACTIVEPIECES_API_SCOPES = [
    "customers:read",
    "customers:write",
    "business:read",
    "emails:read",
    "emails:write",
]
@@ -256,6 +258,7 @@ class ActivepiecesClient:

        # Build the connection upsert request
        # This uses Activepieces' app-connection API
        # The 'protected' metadata prevents users from deleting this connection
        connection_data = {
            "externalId": f"smoothschedule-{tenant.schema_name}",
            "displayName": f"SmoothSchedule ({tenant.name})",
@@ -270,6 +273,12 @@ class ActivepiecesClient:
                    "subdomain": tenant.schema_name,
                },
            },
            "metadata": {
                "protected": True,
                "autoSelect": True,
                "source": "auto-provisioned",
                "description": "Auto-created connection for SmoothSchedule integration. Cannot be deleted.",
            },
        }

        try:
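Because the upsert is keyed by a stable `externalId` per tenant, re-running provisioning is idempotent. A trimmed, standalone sketch of the payload shape — the `build_connection_upsert` helper is hypothetical, and only the fields shown in this hunk are included:

```python
def build_connection_upsert(schema_name: str, tenant_name: str) -> dict:
    """Sketch of the connection upsert body: externalId is stable per tenant,
    and the 'protected' metadata marks the connection as non-deletable."""
    return {
        "externalId": f"smoothschedule-{schema_name}",
        "displayName": f"SmoothSchedule ({tenant_name})",
        "metadata": {
            "protected": True,
            "autoSelect": True,
            "source": "auto-provisioned",
        },
    }


body = build_connection_upsert("acme", "Acme Inc")
print(body["externalId"])  # smoothschedule-acme
```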
@@ -307,6 +316,204 @@ class ActivepiecesClient:
        )
        return result.get("data", [])

    def create_flow(
        self,
        project_id: str,
        token: str,
        flow_data: dict,
        folder_name: Optional[str] = None,
    ) -> dict:
        """
        Create a new flow in Activepieces.

        Args:
            project_id: The Activepieces project ID
            token: Session token for API calls
            flow_data: Flow definition including displayName and trigger
            folder_name: Optional folder name to create/use for this flow

        Returns:
            Created flow object with id
        """
        # Create the flow shell
        create_data = {
            "displayName": flow_data.get("displayName", "Untitled"),
            "projectId": project_id,
        }

        # Add folder if specified
        if folder_name:
            create_data["folderName"] = folder_name

        result = self._request(
            "POST",
            "/api/v1/flows",
            data=create_data,
            token=token,
        )

        flow_id = result.get("id")

        # Apply the full flow definition with trigger and actions
        if flow_id and flow_data.get("trigger"):
            self.import_flow(
                flow_id=flow_id,
                token=token,
                display_name=flow_data.get("displayName", "Untitled"),
                trigger=flow_data["trigger"],
            )

        return result

    def import_flow(
        self, flow_id: str, token: str, display_name: str, trigger: dict
    ) -> dict:
        """
        Import/update a flow with a trigger and actions structure.

        Uses the IMPORT_FLOW operation to set the complete flow definition.

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls
            display_name: Display name for the flow
            trigger: The trigger configuration with actions chain

        Returns:
            Updated flow object
        """
        return self._request(
            "POST",
            f"/api/v1/flows/{flow_id}",
            data={
                "type": "IMPORT_FLOW",
                "request": {
                    "displayName": display_name,
                    "trigger": trigger,
                    "schemaVersion": "1",
                },
            },
            token=token,
        )

    def update_flow_status(self, flow_id: str, token: str, enabled: bool) -> dict:
        """
        Enable or disable a flow.

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls
            enabled: True to enable, False to disable

        Returns:
            Updated flow object
        """
        return self._request(
            "POST",
            f"/api/v1/flows/{flow_id}",
            data={
                "type": "CHANGE_STATUS",
                "request": {
                    "status": "ENABLED" if enabled else "DISABLED",
                },
            },
            token=token,
        )

    def publish_flow(self, flow_id: str, token: str) -> dict:
        """
        Publish a flow (lock and make it live).

        Uses the LOCK_AND_PUBLISH operation which:
        1. Locks the current version
        2. Makes it the published (active) version
        3. Enables the flow

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls

        Returns:
            Updated flow object
        """
        return self._request(
            "POST",
            f"/api/v1/flows/{flow_id}",
            data={
                "type": "LOCK_AND_PUBLISH",
                "request": {},
            },
            token=token,
        )

    def save_sample_data(
        self,
        flow_id: str,
        token: str,
        step_name: str,
        sample_data: dict,
    ) -> dict:
        """
        Save sample data for a flow step (trigger or action).

        This populates the sample data so users can see example
        output and use it in subsequent steps.

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls
            step_name: Name of the step (e.g., "trigger")
            sample_data: Sample data to save

        Returns:
            Updated flow object
        """
        return self._request(
            "POST",
            f"/api/v1/flows/{flow_id}",
            data={
                "type": "SAVE_SAMPLE_DATA",
                "request": {
                    "stepName": step_name,
                    "payload": sample_data,
                    "type": "OUTPUT",  # SampleDataFileType.OUTPUT
                    "dataType": "JSON",  # SampleDataDataType.JSON
                },
            },
            token=token,
        )

    def get_flow(self, flow_id: str, token: str) -> dict:
        """
        Get a flow by ID.

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls

        Returns:
            Flow object
        """
        return self._request(
            "GET",
            f"/api/v1/flows/{flow_id}",
            token=token,
        )

    def delete_flow(self, flow_id: str, token: str) -> None:
        """
        Delete a flow.

        Args:
            flow_id: The Activepieces flow ID
            token: Session token for API calls
        """
        self._request(
            "DELETE",
            f"/api/v1/flows/{flow_id}",
            token=token,
        )

    def trigger_webhook(self, webhook_url: str, payload: dict) -> dict:
        """
        Trigger a flow via its webhook URL.
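These flow operations all POST to `/api/v1/flows/{flow_id}` with a `type` discriminator and an operation-specific `request` body. A standalone sketch of the payload shapes — the `build_flow_operation` helper is illustrative, not part of the client:

```python
def build_flow_operation(op_type: str, **request) -> dict:
    """Build the body for POST /api/v1/flows/{flow_id}, matching the shapes
    used by IMPORT_FLOW, CHANGE_STATUS, and LOCK_AND_PUBLISH."""
    return {"type": op_type, "request": request}


# LOCK_AND_PUBLISH takes an empty request body
publish = build_flow_operation("LOCK_AND_PUBLISH")

# CHANGE_STATUS carries the target status
status_change = build_flow_operation("CHANGE_STATUS", status="ENABLED")

# IMPORT_FLOW carries the full flow definition (trigger contents are
# placeholder values here)
import_op = build_flow_operation(
    "IMPORT_FLOW",
    displayName="Appointment Confirmation Email",
    trigger={"name": "trigger"},
    schemaVersion="1",
)

print(status_change)  # {'type': 'CHANGE_STATUS', 'request': {'status': 'ENABLED'}}
```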
@@ -339,12 +546,101 @@ class ActivepiecesClient:
        """
        return self.embed_url

    def get_session_token(self, tenant) -> tuple[str, str]:
        """
        Get an Activepieces session token for API calls.

        Unlike get_embed_session which returns a trust token for the frontend,
        this returns the actual Activepieces session token needed for API calls.

        Args:
            tenant: The SmoothSchedule tenant

        Returns:
            Tuple of (session_token, project_id)
        """
        provisioning_token = self._generate_trust_token(tenant)
        result = self._request(
            "POST",
            "/api/v1/authentication/django-trust",
            data={"token": provisioning_token},
        )

        session_token = result.get("token")
        project_id = result.get("projectId")

        if not session_token:
            raise ActivepiecesError("Failed to get Activepieces session token")

        return session_token, project_id


def get_activepieces_client() -> ActivepiecesClient:
    """Factory function to get an Activepieces client instance."""
    return ActivepiecesClient()


def provision_tenant_connection(tenant) -> bool:
    """
    Provision SmoothSchedule connection for a tenant in Activepieces.

    This creates the Activepieces project (if needed) and provisions
    a protected SmoothSchedule connection so users can immediately
    use SmoothSchedule triggers and actions.

    Args:
        tenant: The SmoothSchedule tenant (Client model)

    Returns:
        True if connection was successfully provisioned, False otherwise
    """
    from .models import TenantActivepiecesProject

    client = get_activepieces_client()

    try:
        # Get or create the Activepieces project for this tenant
        # by exchanging a trust token
        provisioning_token = client._generate_trust_token(tenant)
        result = client._request(
            "POST",
            "/api/v1/authentication/django-trust",
            data={"token": provisioning_token},
        )

        session_token = result.get("token")
        project_id = result.get("projectId")

        if not session_token or not project_id:
            logger.error(
                f"Failed to get Activepieces session for tenant {tenant.id}: "
                "missing token or projectId"
            )
            return False

        # Store the project mapping
        TenantActivepiecesProject.objects.update_or_create(
            tenant=tenant,
            defaults={
                "activepieces_project_id": project_id,
            },
        )

        # Provision the protected SmoothSchedule connection
        client._provision_smoothschedule_connection(tenant, session_token, project_id)

        logger.info(
            f"Successfully provisioned SmoothSchedule connection for tenant {tenant.id}"
        )
        return True

    except Exception as e:
        logger.error(
            f"Failed to provision SmoothSchedule connection for tenant {tenant.id}: {e}"
        )
        return False


def dispatch_event_webhook(tenant, event_type: str, payload: dict) -> None:
    """
    Dispatch a SmoothSchedule event to Activepieces webhooks.
@@ -103,13 +103,21 @@ class TestActivepiecesClient:

        result = client.get_embed_session(mock_tenant)

        # The token returned is a frontend_token (JWT generated by _generate_trust_token)
        # not the session token from the API. It should be a valid JWT.
        assert "token" in result
        # Verify it's a JWT (should have 3 parts separated by dots)
        assert result["token"].count('.') == 2
        # Verify other fields
        assert result["projectId"] == "project-123"
        assert result["embedUrl"] == "http://localhost:8090"

        # Verify the request was made to the django-trust endpoint
        mock_requests.request.assert_called()
        call_args = mock_requests.request.call_args
        # Access the 'url' keyword argument properly
        url = call_args.kwargs.get("url") or (call_args[1].get("url") if len(call_args) > 1 else "")
        assert "/api/v1/authentication/django-trust" in url

    @patch("smoothschedule.integrations.activepieces.services.requests")
    def test_get_embed_session_error(self, mock_requests):
@@ -9,6 +9,9 @@ from .views import (
    ActivepiecesFlowsView,
    ActivepiecesHealthView,
    ActivepiecesWebhookView,
    DefaultFlowsListView,
    DefaultFlowRestoreView,
    DefaultFlowsRestoreAllView,
)

app_name = "activepieces"
@@ -38,4 +41,20 @@ urlpatterns = [
        ActivepiecesHealthView.as_view(),
        name="health",
    ),
    # Default flows management
    path(
        "default-flows/",
        DefaultFlowsListView.as_view(),
        name="default-flows-list",
    ),
    path(
        "default-flows/<str:flow_type>/restore/",
        DefaultFlowRestoreView.as_view(),
        name="default-flow-restore",
    ),
    path(
        "default-flows/restore-all/",
        DefaultFlowsRestoreAllView.as_view(),
        name="default-flows-restore-all",
    ),
]
@@ -12,8 +12,9 @@ from rest_framework.views import APIView

from smoothschedule.identity.core.mixins import TenantRequiredAPIView

from .models import TenantActivepiecesProject, TenantDefaultFlow
from .services import ActivepiecesError, get_activepieces_client
from .default_flows import get_flow_definition, get_sample_data_for_flow, FLOW_VERSION

logger = logging.getLogger(__name__)
@@ -194,3 +195,324 @@ class ActivepiecesHealthView(APIView):
                {"status": "unhealthy", "error": str(e)},
                status=status.HTTP_503_SERVICE_UNAVAILABLE,
            )


class DefaultFlowsListView(TenantRequiredAPIView, APIView):
    """
    List default automation flows for the current tenant.

    GET /api/activepieces/default-flows/

    Returns:
        {
            "flows": [
                {
                    "flow_type": "appointment_confirmation",
                    "display_name": "Appointment Confirmation Email",
                    "activepieces_flow_id": "...",
                    "is_modified": false,
                    "is_enabled": true,
                    "version": "1.0.0"
                },
                ...
            ]
        }
    """

    permission_classes = [IsAuthenticated]

    def get(self, request):
        tenant = self.tenant

        flows = TenantDefaultFlow.objects.filter(tenant=tenant)

        flow_data = [
            {
                "flow_type": flow.flow_type,
                "display_name": flow.get_flow_type_display(),
                "activepieces_flow_id": flow.activepieces_flow_id,
                "is_modified": flow.is_modified,
                "is_enabled": flow.is_enabled,
                "version": flow.version,
                "created_at": flow.created_at.isoformat(),
                "updated_at": flow.updated_at.isoformat(),
            }
            for flow in flows
        ]

        return self.success_response({"flows": flow_data})


class DefaultFlowRestoreView(TenantRequiredAPIView, APIView):
    """
    Restore a default flow to its original definition.

    POST /api/activepieces/default-flows/<flow_type>/restore/

    This updates the existing flow in Activepieces with the original
    default definition, preserving the flow ID.

    URL params:
        flow_type: One of appointment_confirmation, appointment_reminder,
                   thank_you, payment_deposit, payment_final

    Returns:
        {
            "success": true,
            "flow_type": "appointment_confirmation",
            "message": "Flow restored to default"
        }
    """

    permission_classes = [IsAuthenticated]

    def post(self, request, flow_type):
        tenant = self.tenant

        # Validate flow type
        valid_types = [choice[0] for choice in TenantDefaultFlow.FlowType.choices]
        if flow_type not in valid_types:
            return self.error_response(
                f"Invalid flow type. Must be one of: {', '.join(valid_types)}",
                status_code=status.HTTP_400_BAD_REQUEST,
            )

        # Get the default flow record
        try:
            default_flow = TenantDefaultFlow.objects.get(
                tenant=tenant, flow_type=flow_type
            )
        except TenantDefaultFlow.DoesNotExist:
            return self.error_response(
                f"Default flow '{flow_type}' not found for this tenant",
                status_code=status.HTTP_404_NOT_FOUND,
            )

        client = get_activepieces_client()

        try:
            # Get session token for API calls (not trust token for frontend)
            token, project_id = client.get_session_token(tenant)

            if not token:
                return self.error_response(
                    "Failed to get Activepieces session",
                    status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                )

            # Get the original flow definition
            flow_def = get_flow_definition(flow_type)

            # Try to update the existing flow, or create a new one if it doesn't exist
            try:
                client.import_flow(
                    flow_id=default_flow.activepieces_flow_id,
                    token=token,
                    display_name=flow_def.get("displayName", flow_type),
                    trigger=flow_def.get("trigger"),
                )
                new_flow_id = default_flow.activepieces_flow_id
            except ActivepiecesError as e:
                # Flow doesn't exist in Activepieces (e.g., after container rebuild)
                # Create a new one
                if "404" in str(e):
                    logger.warning(
                        f"Flow {default_flow.activepieces_flow_id} not found in Activepieces, "
                        f"creating new flow for {flow_type}"
                    )
                    created_flow = client.create_flow(
                        project_id=project_id,
                        token=token,
                        flow_data={
                            "displayName": flow_def.get("displayName", flow_type),
                            "trigger": flow_def.get("trigger"),
                        },
                        folder_name="Defaults",
                    )
                    new_flow_id = created_flow.get("id")
                    if not new_flow_id:
                        raise ActivepiecesError("Failed to create replacement flow")
                else:
                    raise

            # Save sample data for the trigger
            sample_data = get_sample_data_for_flow(flow_type)
            if sample_data:
                try:
                    client.save_sample_data(
                        flow_id=new_flow_id,
                        token=token,
                        step_name="trigger",
                        sample_data=sample_data,
                    )
                except Exception as e:
                    logger.warning(f"Failed to save sample data for flow {flow_type}: {e}")

            # Publish the flow (locks version and enables)
            try:
                client.publish_flow(new_flow_id, token)
            except Exception as e:
                logger.warning(f"Failed to publish flow {flow_type}, enabling instead: {e}")
                # Fallback to just enabling
                client.update_flow_status(new_flow_id, token, enabled=True)

            # Update the Django record
            default_flow.activepieces_flow_id = new_flow_id
            default_flow.is_modified = False
            default_flow.default_flow_json = flow_def
            default_flow.version = FLOW_VERSION
            default_flow.save()

            logger.info(
                f"Restored default flow {flow_type} for tenant {tenant.schema_name}"
            )

            return self.success_response({
                "success": True,
                "flow_type": flow_type,
                "message": "Flow restored to default",
            })

        except ActivepiecesError as e:
            logger.error(f"Failed to restore flow: {e}")
            return self.error_response(
                "Failed to restore flow in Activepieces",
                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
            )


class DefaultFlowsRestoreAllView(TenantRequiredAPIView, APIView):
    """
    Restore all default flows to their original definitions.

    POST /api/activepieces/default-flows/restore-all/

    This restores all default flows for the tenant, preserving flow IDs.

    Returns:
        {
            "success": true,
            "restored": ["appointment_confirmation", "appointment_reminder", ...],
            "failed": [],
            "message": "Restored 5 flows"
        }
    """

    permission_classes = [IsAuthenticated]

    def post(self, request):
        tenant = self.tenant

        client = get_activepieces_client()

        try:
            # Get session token for API calls (not trust token for frontend)
            token, project_id = client.get_session_token(tenant)

            if not token:
                return self.error_response(
                    "Failed to get Activepieces session",
                    status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                )

            restored = []
            failed = []

            # Get all default flows for this tenant
            default_flows = TenantDefaultFlow.objects.filter(tenant=tenant)

            for default_flow in default_flows:
                try:
                    # Get the original flow definition
                    flow_def = get_flow_definition(default_flow.flow_type)

                    # Try to update the existing flow, or create a new one if it doesn't exist
                    try:
                        client.import_flow(
                            flow_id=default_flow.activepieces_flow_id,
                            token=token,
                            display_name=flow_def.get("displayName", default_flow.flow_type),
                            trigger=flow_def.get("trigger"),
                        )
                        new_flow_id = default_flow.activepieces_flow_id
                    except ActivepiecesError as e:
                        # Flow doesn't exist in Activepieces (e.g., after container rebuild)
                        # Create a new one
                        if "404" in str(e):
                            logger.warning(
                                f"Flow {default_flow.activepieces_flow_id} not found in Activepieces, "
                                f"creating new flow for {default_flow.flow_type}"
                            )
                            created_flow = client.create_flow(
                                project_id=project_id,
                                token=token,
                                flow_data={
                                    "displayName": flow_def.get("displayName", default_flow.flow_type),
                                    "trigger": flow_def.get("trigger"),
                                },
                                folder_name="Defaults",
                            )
                            new_flow_id = created_flow.get("id")
                            if not new_flow_id:
                                raise ActivepiecesError("Failed to create replacement flow")
                        else:
                            raise

                    # Save sample data for the trigger
                    sample_data = get_sample_data_for_flow(default_flow.flow_type)
                    if sample_data:
                        try:
                            client.save_sample_data(
                                flow_id=new_flow_id,
                                token=token,
                                step_name="trigger",
                                sample_data=sample_data,
                            )
                        except Exception as e:
                            logger.warning(
                                f"Failed to save sample data for flow {default_flow.flow_type}: {e}"
                            )

                    # Publish the flow (locks version and enables)
                    try:
                        client.publish_flow(new_flow_id, token)
                    except Exception as e:
                        logger.warning(
                            f"Failed to publish flow {default_flow.flow_type}, enabling instead: {e}"
                        )
                        # Fallback to just enabling
                        client.update_flow_status(new_flow_id, token, enabled=True)
|
||||||
|
# Update the Django record
|
||||||
|
default_flow.activepieces_flow_id = new_flow_id
|
||||||
|
default_flow.is_modified = False
|
||||||
|
default_flow.default_flow_json = flow_def
|
||||||
|
default_flow.version = FLOW_VERSION
|
||||||
|
default_flow.save()
|
||||||
|
|
||||||
|
restored.append(default_flow.flow_type)
|
||||||
|
logger.info(
|
||||||
|
f"Restored default flow {default_flow.flow_type} for tenant {tenant.schema_name}"
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
failed.append(default_flow.flow_type)
|
||||||
|
logger.error(
|
||||||
|
f"Failed to restore flow {default_flow.flow_type}: {e}"
|
||||||
|
)
|
||||||
|
|
||||||
|
return self.success_response({
|
||||||
|
"success": len(failed) == 0,
|
||||||
|
"restored": restored,
|
||||||
|
"failed": failed,
|
||||||
|
"message": f"Restored {len(restored)} flows" + (
|
||||||
|
f", {len(failed)} failed" if failed else ""
|
||||||
|
),
|
||||||
|
})
|
||||||
|
|
||||||
|
except ActivepiecesError as e:
|
||||||
|
logger.error(f"Failed to restore flows: {e}")
|
||||||
|
return self.error_response(
|
||||||
|
"Failed to restore flows in Activepieces",
|
||||||
|
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
|
||||||
|
)
|
||||||
|
@@ -32,6 +32,8 @@ class APIScope:
     CUSTOMERS_WRITE = 'customers:write'
     BUSINESS_READ = 'business:read'
     WEBHOOKS_MANAGE = 'webhooks:manage'
+    EMAILS_READ = 'emails:read'
+    EMAILS_WRITE = 'emails:write'

     CHOICES = [
         (SERVICES_READ, 'View services and pricing'),
@@ -43,6 +45,8 @@ class APIScope:
         (CUSTOMERS_WRITE, 'Create and update customers'),
         (BUSINESS_READ, 'View business information'),
         (WEBHOOKS_MANAGE, 'Manage webhook subscriptions'),
+        (EMAILS_READ, 'View email templates'),
+        (EMAILS_WRITE, 'Send emails using templates'),
     ]

     ALL_SCOPES = [choice[0] for choice in CHOICES]
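This hunk adds two email scopes to the `APIScope` class; because `ALL_SCOPES` is derived from `CHOICES`, they are picked up automatically. A self-contained sketch of how a token's granted scopes might be checked against these constants (only the constants and `CHOICES`/`ALL_SCOPES` shape come from the diff; the `has_scope` helper is hypothetical):

```python
class APIScope:
    """Subset of the scope constants shown in the diff."""
    WEBHOOKS_MANAGE = 'webhooks:manage'
    EMAILS_READ = 'emails:read'
    EMAILS_WRITE = 'emails:write'

    CHOICES = [
        (WEBHOOKS_MANAGE, 'Manage webhook subscriptions'),
        (EMAILS_READ, 'View email templates'),
        (EMAILS_WRITE, 'Send emails using templates'),
    ]

    # Derived list, so newly added scopes are included automatically.
    ALL_SCOPES = [choice[0] for choice in CHOICES]


def has_scope(granted_scopes: list, required: str) -> bool:
    """Hypothetical check: an API token may only call endpoints whose scope it holds."""
    return required in granted_scopes


token_scopes = [APIScope.EMAILS_READ]
print(has_scope(token_scopes, APIScope.EMAILS_READ))   # → True
print(has_scope(token_scopes, APIScope.EMAILS_WRITE))  # → False
```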
Some files were not shown because too many files have changed in this diff.