Introduction
Implementing a “Download PDF Report” feature with user data collection in a Docusaurus website presents unique challenges, especially when integrating with external services like Notion. This article documents our journey building this functionality for the FemTech Weekend website, covering the obstacles we faced and the solutions we implemented. By the end, you’ll understand how to create a PDF download component that collects user information and stores it in a Notion database, working seamlessly in both local development and Vercel deployment.
Project Overview
Our goal was to create a component that would:
- Display a “Download Full PDF Report” button on blog posts
- Show a form modal when clicked to collect user information
- Submit this information to a Notion database
- Allow the user to download the PDF after form submission
- Work in both development and production environments
The Architecture
The final solution involved several key components:
- A `DownloadPdfButton` React component
- An API route handler in `src/api/pdf-form-submit.js`
- A forwarding API endpoint in `api/pdf-form-submit/index.js`
- A custom Docusaurus plugin for API route handling
- Development scripts to run both Docusaurus and API servers
Challenge 1: Understanding the Docusaurus API Route System
The Problem
Unlike Next.js, Docusaurus doesn’t have a built-in API routes system. Our initial attempts to create API endpoints failed because we didn’t understand how API routes are handled in Docusaurus.
The Solution
We discovered that Docusaurus requires a custom plugin to handle API routes. This plugin sets up a proxy in development mode that forwards API requests to a separate Node.js server running on port 3001.
```js
// src/plugins/api-routes.js
module.exports = function apiRoutesPlugin(context, options) {
  return {
    name: 'docusaurus-api-routes-plugin',
    configureWebpack(config, isServer) {
      if (!isServer && process.env.NODE_ENV === 'development') {
        return {
          devServer: {
            proxy: {
              '/api': {
                target: 'http://localhost:3001',
                secure: false,
                changeOrigin: true,
                // Additional configuration...
              },
            },
          },
        };
      }
      return {};
    },
    // Additional plugin configuration...
  };
};
```
This plugin must be registered in `docusaurus.config.ts` to work properly.
Challenge 2: CORS and Request Handling
The Problem
Even after setting up the API routes plugin, we encountered CORS errors when submitting the form, particularly in the development environment.
The Solution
We added proper CORS headers to our API server configuration and ensured that the proxy settings correctly handled both OPTIONS preflight requests and actual POST requests:
```js
// In the plugin's proxy configuration
onProxyRes: (proxyRes, req, res) => {
  // Add CORS headers to the response
  proxyRes.headers['Access-Control-Allow-Origin'] = '*';
  proxyRes.headers['Access-Control-Allow-Methods'] = 'GET, POST, OPTIONS';
  proxyRes.headers['Access-Control-Allow-Headers'] = 'Content-Type';
},
```
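Behind the proxy, the standalone API server needs the same headers plus an early return for OPTIONS preflights. A minimal sketch (the `applyCors` helper name is ours, not from the project's code):

```javascript
// Hypothetical helper for the standalone API server: set permissive CORS
// headers and short-circuit OPTIONS preflight requests before they reach
// a route handler.
function applyCors(req, res) {
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
  if (req.method === 'OPTIONS') {
    res.statusCode = 204; // No Content: a preflight needs headers only
    res.end();
    return true; // request fully handled, skip the route handler
  }
  return false;
}
```

Each route handler then runs only when `applyCors` returns `false`, so preflights never hit the Notion logic.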
Challenge 3: API Server Configuration
The Problem
We needed a separate API server to handle requests when in development mode, but weren’t sure how to structure it properly.
The Solution
We created an `api-server.js` file that:
- Sets up a Node.js HTTP server on port 3001
- Loads environment variables from `.env.local`
- Routes API requests to the appropriate handlers in the `src/api` directory
- Provides proper error handling and logging

For Vercel deployment, we created a parallel structure in the `api` folder that forwards requests to the main implementation.
Challenge 4: Notion Database Integration
The Problem
Connecting to Notion and storing form submissions required specific configuration and error handling to work reliably.
The Solution
We implemented a comprehensive Notion integration in our `pdf-form-submit.js` handler:
```js
// src/api/pdf-form-submit.js
const { Client } = require('@notionhq/client');

// Initialize Notion client
const notion = new Client({
  auth: process.env.NOTION_TOKEN,
});

async function handler(req, res) {
  const { fullName } = req.body;

  // Validation and error handling
  // ...

  // Create Notion page properties
  const properties = {
    Name: {
      title: [{ text: { content: fullName } }],
    },
    // Additional properties...
  };

  // Create the page in Notion
  const response = await notion.pages.create({
    parent: { database_id: process.env.PDF_FORM_DATABASE_ID },
    properties: properties,
  });

  // Return success response
  // ...
}

module.exports = handler;
```
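The validation step elided above can be sketched like this (only `fullName` appears in the original code; the `email` field, helper name, and messages are assumptions):

```javascript
// Hypothetical validation for the form payload. Returns an error message
// string, or null when the payload is acceptable.
function validateSubmission(body) {
  if (!body || typeof body !== 'object') return 'Missing request body';
  if (!body.fullName || !String(body.fullName).trim()) return 'Full name is required';
  if (body.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(body.email)) return 'Email address is invalid';
  return null;
}
```

The handler would call this before touching Notion and respond with a 400 status when it returns a message.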
Challenge 5: Development Environment Setup
The Problem
Running both the Docusaurus server and the API server simultaneously was cumbersome and error-prone.
The Solution
We created a custom `start-dev.js` script that:
- Checks if required environment variables are present
- Creates a default `.env.local` if it doesn’t exist
- Checks if necessary ports are available
- Starts both servers in the correct order
- Provides clear console logging for debugging
```js
// start-dev.js
const { spawn } = require('child_process');

async function startServers() {
  // Start API server first
  const apiProcess = spawn('node', ['api-server.js'], {
    stdio: 'inherit',
    shell: true,
  });

  // Wait a moment for API server to initialize
  await new Promise(resolve => setTimeout(resolve, 2000));

  // Start Docusaurus server
  const docusaurusProcess = spawn('npm', ['run', 'start'], {
    stdio: 'inherit',
    shell: true,
    env: { ...process.env, BROWSER: 'none' },
  });

  // Handle process errors and shutdown
  // ...
}
```
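The environment-variable check from the list above can be as simple as the following sketch (the variable names come from the handler code; the helper name is ours):

```javascript
// Return the names of required variables that are absent or blank.
// NOTION_TOKEN and PDF_FORM_DATABASE_ID are what the Notion handler reads.
function missingEnvVars(env, required = ['NOTION_TOKEN', 'PDF_FORM_DATABASE_ID']) {
  return required.filter((name) => !env[name] || !String(env[name]).trim());
}

// start-dev.js would call this before spawning anything, e.g.:
// const missing = missingEnvVars(process.env);
// if (missing.length) { console.error(`Missing in .env.local: ${missing.join(', ')}`); process.exit(1); }
```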
Challenge 6: Environment Variables
The Problem
Managing environment variables between development and production environments was challenging, especially for the Notion integration.
The Solution
We implemented a system that:
- Uses `.env.local` for local development
- Relies on Vercel environment variables for production
- Exposes necessary variables to client-side code through the plugin
```js
// In the plugin's injectHtmlTags method
injectHtmlTags() {
  return {
    headTags: [
      {
        tagName: 'script',
        innerHTML: `
          window.ENV = {
            NOTION_DATABASE_ID: '${process.env.NOTION_DATABASE_ID || ''}',
            // Other variables...
          };
        `,
      },
    ],
  };
},
```
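Client-side code can then read the injected values with a small guard for server-side rendering, where `window` is undefined (the helper name here is ours):

```javascript
// Safely read a value injected into window.ENV by the plugin.
// Returns '' during server-side rendering or when the key is missing.
function getClientEnv(key, win = typeof window !== 'undefined' ? window : undefined) {
  return (win && win.ENV && win.ENV[key]) || '';
}
```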
The Final Implementation
Component Structure
Our final `DownloadPdfButton` component:
- Shows a button that triggers a modal form
- Collects user information
- Submits the form to our API endpoint
- Provides feedback on submission status
- Opens the PDF link after successful submission
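The submission logic at the heart of the component can be sketched as follows (a hypothetical extraction; the real component wraps this in form state and modal handling):

```javascript
// Post the collected form data to the API endpoint and return the parsed
// response. fetchImpl is injectable so the logic can be exercised without
// a network; the component itself would call submitPdfForm(formData).
async function submitPdfForm(formData, fetchImpl = fetch) {
  const response = await fetchImpl('/api/pdf-form-submit', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(formData),
  });
  if (!response.ok) {
    throw new Error(`Form submission failed with status ${response.status}`);
  }
  return response.json();
}
```

On a resolved promise the component opens the PDF link; on a thrown error it surfaces the failure state in the modal.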
API Handler Structure
The API handler follows these steps:
- Validates incoming request data
- Formats the data for Notion
- Creates a new page in the Notion database
- Returns appropriate success/error responses
Development Setup
To run the project locally:
- Create a `.env.local` file with required Notion credentials
- Run `node start-dev.js` to start both servers
- The Docusaurus site runs on port 3000, while the API server runs on port 3001
Production Deployment
For Vercel deployment:
- Configure environment variables in the Vercel project settings
- Vercel’s serverless functions automatically handle the API routes
- The same code works in both environments thanks to our API forwarding structure
Key Lessons Learned
- Understand the Platform: Docusaurus handles API routes differently than frameworks like Next.js
- Dual Server Architecture: Running separate servers for frontend and API in development provides flexibility
- Error Logging: Comprehensive logging helps identify issues quickly
- Environment Consistency: Ensuring development and production environments are configured similarly prevents deployment surprises
- API Forwarding: Creating a forwarding structure ensures the same code works in both environments
Conclusion
Building a PDF download component with Notion integration in Docusaurus requires understanding both the limitations of the platform and how to work around them. By setting up a custom plugin, implementing proper API handlers, and ensuring consistent environment configuration, we successfully created a solution that works both locally and in production.
This approach can be extended to other external integrations beyond Notion, making it a valuable pattern for any Docusaurus project that needs to handle form submissions or other API interactions.