
Automated Publishing Workflow for ListLog

Set up automated publishing workflows for your ListLog blog using GitHub Actions, GitLab CI, scheduled posts, and content management systems. Streamline your blogging process.

Scheduled Publishing with GitHub Actions

Automatically publish future-dated posts at specified times using GitHub Actions cron jobs:

Create the Workflow

Create .github/workflows/scheduled-deploy.yml:

name: Scheduled Deploy

on:
  schedule:
    # Run every hour at 5 minutes past the hour
    - cron: '5 * * * *'
  workflow_dispatch: # Allow manual trigger

jobs:
  check-and-deploy:
    runs-on: ubuntu-latest
    
    steps:
    - name: Checkout Repository
      uses: actions/checkout@v4

    - name: Setup Node.js
      uses: actions/setup-node@v4
      with:
        node-version: '18'

    - name: Install Dependencies
      # npm ci gives reproducible installs from package-lock.json
      run: npm ci

    - name: Check for Publishable Posts
      id: check-posts
      run: |
        node scripts/check-scheduled-posts.js

    - name: Build Site
      if: steps.check-posts.outputs.has-new-posts == 'true'
      run: |
        npm run build
        
    - name: Deploy to GitHub Pages
      if: steps.check-posts.outputs.has-new-posts == 'true'
      uses: peaceiris/actions-gh-pages@v3
      with:
        github_token: ${{ secrets.GITHUB_TOKEN }}
        publish_dir: ./dist

Create the Check Script

Create scripts/check-scheduled-posts.js:

const fs = require('fs');
const path = require('path');

// Check if any posts should be published now
function checkScheduledPosts() {
  const postsDir = './posts';
  const now = new Date();
  let hasNewPosts = false;

  if (!fs.existsSync(postsDir)) {
    console.log('No posts directory found');
    return false;
  }

  const postFiles = fs.readdirSync(postsDir)
    .filter(file => file.endsWith('.md'));

  for (const file of postFiles) {
    const content = fs.readFileSync(path.join(postsDir, file), 'utf8');
    const match = content.match(/^---\n(.*?)\n---/s);
    
    if (match) {
      const frontMatter = match[1];
      const dateMatch = frontMatter.match(/^date:\s*(.+)$/m);
      const publishMatch = frontMatter.match(/^published:\s*(.+)$/m);
      
      if (dateMatch) {
        const postDate = new Date(dateMatch[1].trim());
        const isPublished = publishMatch ? 
          publishMatch[1].trim().toLowerCase() === 'true' : true;
        
        // If post date is now or in the past, and not explicitly unpublished
        if (postDate <= now && isPublished) {
          console.log(`Post ready to publish: ${file}`);
          hasNewPosts = true;
        }
      }
    }
  }

  // Expose the result to later workflow steps via the GITHUB_OUTPUT file
  // (the older ::set-output workflow command is deprecated)
  if (process.env.GITHUB_OUTPUT) {
    fs.appendFileSync(process.env.GITHUB_OUTPUT, `has-new-posts=${hasNewPosts}\n`);
  }
  return hasNewPosts;
}

checkScheduledPosts();

How it works: The workflow runs every hour, checks for posts whose dates have passed, and rebuilds your site if new content should be published.

Draft and Publishing States

Manage post visibility with front matter flags:

Post States

Front Matter                   Status      Behavior
published: true                Published   Post appears on site
published: false               Draft       Post hidden from public
date: 2025-06-01 (future)      Scheduled   Auto-publishes on date
No published field             Published   Default behavior
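
The rules in the table above can be collapsed into one small helper. A minimal sketch, assuming front matter has been parsed into a plain object (postState is a hypothetical name, not part of ListLog):

```javascript
// Sketch: derive a post's state from parsed front matter (hypothetical helper)
function postState(meta, now = new Date()) {
  // An explicit published: false always wins
  if (String(meta.published).toLowerCase() === 'false') return 'draft';
  // A future date means the post is scheduled, not live
  if (meta.date && new Date(meta.date) > now) return 'scheduled';
  // No published field defaults to published
  return 'published';
}
```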

Example Draft Post

---
title: My Future Post
date: 2025-06-15
author: Your Name
published: false
tags: upcoming, draft
excerpt: This post will be published later
---

# Coming Soon

This content is still being worked on...

Update Build Script for Drafts

Modify your build.js to respect the published flag:

// In your generatePost function, add this check
// (handle both string and boolean values from front matter parsing):
const isPublished = metadata.published !== 'false' && metadata.published !== false;
const postDate = new Date(metadata.date || new Date());
const now = new Date();

// Only include posts that are published and not in the future
if (!isPublished || postDate > now) {
  console.log(`Skipping: ${post.title} (not ready for publication)`);
  return null; // Skip this post
}

Headless CMS Integration

Connect ListLog to headless CMS platforms for easier content management:

Forestry.io Integration

  1. Connect your repository to Forestry.io (now succeeded by TinaCMS)
  2. Configure front matter templates for consistent post structure
  3. Set up media management for images and files
  4. Enable auto-deployment when content changes

Netlify CMS Setup

Create admin/config.yml for a web-based editor (Netlify CMS now continues as Decap CMS; the config format is the same):

backend:
  name: git-gateway
  branch: main

media_folder: "assets/images"
public_folder: "/assets/images"

collections:
  - name: "posts"
    label: "Blog Posts"
    folder: "posts"
    create: true
    slug: "{{year}}-{{month}}-{{day}}-{{slug}}"
    fields:
      - {label: "Title", name: "title", widget: "string"}
      - {label: "Date", name: "date", widget: "datetime"}
      - {label: "Author", name: "author", widget: "string"}
      - {label: "Tags", name: "tags", widget: "string"}
      - {label: "Excerpt", name: "excerpt", widget: "text"}
      - {label: "Published", name: "published", widget: "boolean", default: true}
      - {label: "Body", name: "body", widget: "markdown"}

Contentful Integration

Use Contentful's API to fetch content and generate markdown files:

// scripts/sync-contentful.js
const contentful = require('contentful');
const fs = require('fs');

const client = contentful.createClient({
  space: 'your-space-id',
  accessToken: 'your-access-token'
});

async function syncPosts() {
  const entries = await client.getEntries({
    content_type: 'blogPost'
  });

  entries.items.forEach(entry => {
    const post = entry.fields;
    const filename = `posts/${post.slug}.md`;
    
    // Quote string values so titles or excerpts containing colons stay valid YAML
    const frontMatter = `---
title: "${post.title}"
date: ${post.publishDate}
author: ${post.author}
tags: ${(post.tags || []).join(', ')}
excerpt: "${post.excerpt}"
---

${post.content}`;

    fs.writeFileSync(filename, frontMatter);
  });
}

syncPosts();

Webhook-Based Publishing

Trigger builds automatically when content changes in external systems:

GitHub Webhook Setup

  1. Go to repository Settings → Webhooks
  2. Add webhook URL: Your deployment service endpoint
  3. Select events: Push, Pull request
  4. Configure payload: JSON format

Custom Build Trigger

Create an endpoint that accepts webhooks and triggers builds:

// webhook-handler.js (Node.js example)
const express = require('express');
const crypto = require('crypto');
const { exec } = require('child_process');
const app = express();

// Keep the raw request bytes so the signature can be checked against them
app.use(express.json({
  verify: (req, res, buf) => { req.rawBody = buf; }
}));

// Compare the X-Hub-Signature-256 header against an HMAC of the raw payload
function verifySignature(rawBody, signature) {
  if (!signature || !process.env.WEBHOOK_SECRET) return false;
  const expected = 'sha256=' + crypto
    .createHmac('sha256', process.env.WEBHOOK_SECRET)
    .update(rawBody)
    .digest('hex');
  return expected.length === signature.length &&
    crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(signature));
}

app.post('/webhook/build', (req, res) => {
  // Verify webhook signature for security
  const signature = req.headers['x-hub-signature-256'];
  
  if (verifySignature(req.rawBody, signature)) {
    // Trigger build process
    exec('npm run build && npm run deploy', (error, stdout, stderr) => {
      if (error) {
        console.error(`Build failed: ${error}`);
        return res.status(500).json({ error: 'Build failed' });
      }
      
      console.log('Build completed successfully');
      res.json({ status: 'Build triggered' });
    });
  } else {
    res.status(401).json({ error: 'Unauthorized' });
  }
});

app.listen(3000, () => {
  console.log('Webhook server running on port 3000');
});

Zapier Integration

Connect external services to your blog publishing workflow:

  • Google Sheets → GitHub: Create posts from spreadsheet rows
  • Notion → Repository: Sync pages as blog posts
  • RSS → Posts: Import content from other sources
  • Email → Draft: Send emails to create draft posts

Social Media Automation

Automatically share new posts on social media platforms:

GitHub Action for Social Sharing

Add to your deployment workflow:

- name: Share on Social Media
  # assumes your deploy step exposes a 'published' output
  if: steps.deploy.outputs.published == 'true'
  run: |
    node scripts/social-share.js
  env:
    TWITTER_API_KEY: ${{ secrets.TWITTER_API_KEY }}
    LINKEDIN_TOKEN: ${{ secrets.LINKEDIN_TOKEN }}

Social Sharing Script

// scripts/social-share.js
const fs = require('fs');
const { TwitterApi } = require('twitter-api-v2');

async function shareLatestPost() {
  // Get the latest published post (getLatestPosts is your site's own helper)
  const posts = getLatestPosts(1);
  const post = posts[0];
  
  if (!post) return;
  
  const tweetText = `New blog post: ${post.title}
  
${post.excerpt}

Read more: ${process.env.SITE_URL}/${post.slug}/

#blog #${post.tags.split(',')[0].trim()}`;

  // Share on Twitter (twitter-api-v2 exports the TwitterApi class)
  const twitterClient = new TwitterApi({
    appKey: process.env.TWITTER_API_KEY,
    appSecret: process.env.TWITTER_API_SECRET,
    accessToken: process.env.TWITTER_ACCESS_TOKEN,
    accessSecret: process.env.TWITTER_ACCESS_SECRET,
  });

  try {
    await twitterClient.v2.tweet(tweetText);
    console.log('Shared on Twitter successfully');
  } catch (error) {
    console.error('Twitter sharing failed:', error);
  }
}

shareLatestPost();

IFTTT Integration

Use IFTTT applets for simple social automation:

  • RSS → Twitter: Auto-tweet new posts from your RSS feed
  • RSS → Facebook: Share to Facebook page
  • RSS → LinkedIn: Post to LinkedIn company page
  • RSS → Discord: Notify community channels

Analytics and Monitoring

Automate analytics tracking and performance monitoring:

Automated Analytics Reports

Generate weekly analytics reports:

// scripts/analytics-report.js
const { google } = require('googleapis');

async function generateWeeklyReport() {
  // Reporting API v4 covers Universal Analytics; GA4 properties
  // use the Analytics Data API (google.analyticsdata) instead
  const analytics = google.analyticsreporting('v4');
  
  const report = await analytics.reports.batchGet({
    auth: auth, // a configured google.auth client
    requestBody: {
      reportRequests: [{
        viewId: 'YOUR_VIEW_ID',
        dateRanges: [{
          startDate: '7daysAgo',
          endDate: 'today'
        }],
        metrics: [
          {expression: 'ga:sessions'},
          {expression: 'ga:pageviews'},
          {expression: 'ga:avgSessionDuration'}
        ],
        dimensions: [{name: 'ga:pagePath'}]
      }]
    }
  });

  // Process and send report via email or Slack
  sendReport(report.data);
}

// Schedule to run weekly (requires a long-running process;
// a CI cron job is usually more reliable)
setInterval(generateWeeklyReport, 7 * 24 * 60 * 60 * 1000);

Uptime Monitoring

Monitor your blog's availability:

// scripts/uptime-check.js
const https = require('https');

function checkUptime() {
  const url = process.env.SITE_URL;
  
  https.get(url, (res) => {
    if (res.statusCode === 200) {
      console.log('Site is up and running');
    } else {
      console.error(`Site returned status: ${res.statusCode}`);
      sendAlert(`Site down: ${url} returned ${res.statusCode}`);
    }
  }).on('error', (err) => {
    console.error('Site is down:', err.message);
    sendAlert(`Site down: ${err.message}`);
  });
}

// Check every 5 minutes
setInterval(checkUptime, 5 * 60 * 1000);

Performance Monitoring

  • Lighthouse CI: Automated performance audits
  • Core Web Vitals: Track loading, interactivity, visual stability
  • Bundle Analysis: Monitor asset sizes and optimization
  • SEO Monitoring: Track search rankings and indexing
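
For the first bullet, Lighthouse CI reads a .lighthouserc.json from the repository root. A minimal sketch (the URL and score threshold are placeholders to adjust for your site):

```json
{
  "ci": {
    "collect": {
      "url": ["https://your-site.example/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "categories:performance": ["warn", { "minScore": 0.9 }]
      }
    },
    "upload": { "target": "temporary-public-storage" }
  }
}
```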

Automated Backups

Ensure your content is safely backed up automatically:

GitHub Repository Backup

name: Backup Repository

on:
  schedule:
    - cron: '0 2 * * 0' # Weekly at 2 AM UTC
  workflow_dispatch:

jobs:
  backup:
    runs-on: ubuntu-latest
    steps:
    - name: Checkout Repository
      uses: actions/checkout@v4
      with:
        fetch-depth: 0 # Full history
        
    - name: Create Backup Archive
      run: |
        tar -czf backup-$(date +%Y-%m-%d).tar.gz .
        
    - name: Upload to Cloud Storage
      uses: actions/upload-artifact@v4
      with:
        name: repository-backup
        path: backup-*.tar.gz
        retention-days: 90

Content-Only Backup

// scripts/backup-content.js
const fs = require('fs');
const path = require('path');
const archiver = require('archiver');

function backupContent() {
  // Make sure the target directory exists before streaming into it
  fs.mkdirSync('backups', { recursive: true });
  const output = fs.createWriteStream(`backups/content-${new Date().toISOString().split('T')[0]}.zip`);
  const archive = archiver('zip', { zlib: { level: 9 } });

  output.on('close', () => {
    console.log(`Backup created: ${archive.pointer()} total bytes`);
  });

  archive.on('error', (err) => {
    throw err;
  });

  archive.pipe(output);

  // Add content directories
  archive.directory('posts/', 'posts');
  archive.directory('assets/', 'assets');
  archive.file('listlog.config.json', { name: 'listlog.config.json' });

  archive.finalize();
}

backupContent();

Cloud Storage Integration

  • AWS S3: Store backups in S3 with lifecycle policies
  • Google Drive: Sync content to Google Drive automatically
  • Dropbox: Mirror posts to Dropbox for easy access
  • Git LFS: Store large media files efficiently
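
Whichever target you choose, pair backups with a retention rule so storage stays bounded. A minimal local sketch, assuming the date-stamped filenames produced by backup-content.js above (backupsToDelete is a hypothetical helper):

```javascript
// Sketch: keep only the N newest date-stamped backups (hypothetical retention helper)
function backupsToDelete(files, keep = 8) {
  const dated = files
    .filter(f => /^content-\d{4}-\d{2}-\d{2}\.zip$/.test(f))
    .sort(); // ISO dates sort chronologically as strings
  // Everything except the newest `keep` archives is a deletion candidate
  return dated.slice(0, Math.max(0, dated.length - keep));
}
```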

Automation Best Practices

Security Considerations

  • Use secrets management for API keys and tokens
  • Validate webhook signatures to prevent unauthorized triggers
  • Limit automation permissions to minimum required scope
  • Monitor automation logs for suspicious activity

Performance Optimization

  • Cache dependencies in CI/CD workflows
  • Use incremental builds when possible
  • Optimize scheduling to avoid unnecessary runs
  • Implement failure retry logic for reliability
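
The last bullet can be sketched as a small wrapper with exponential backoff (withRetry is a hypothetical helper; tune attempts and delays to the APIs you call):

```javascript
// Sketch: retry an async step with exponential backoff
async function withRetry(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === attempts) throw err; // out of retries
      // Wait 500ms, 1000ms, 2000ms, ... between attempts
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1)));
    }
  }
}
```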

Monitoring and Alerts

  • Set up failure notifications for critical workflows
  • Monitor build times and optimize when needed
  • Track automation costs on cloud platforms
  • Review logs regularly for improvement opportunities

Rate Limits: Be mindful of API rate limits when setting up frequent automation. Space out requests and implement proper error handling.

Getting Started with Automation

Start simple and gradually add more automation to your workflow:

Beginner Automation

  1. Scheduled publishing — Set up basic cron-based deployment
  2. Draft management — Use published flags for content control
  3. Simple social sharing — Connect RSS to IFTTT

Intermediate Automation

  1. Headless CMS integration — Connect Forestry or Netlify CMS
  2. Webhook publishing — Build on external content changes
  3. Analytics reporting — Automated performance insights

Advanced Automation

  1. Multi-platform publishing — Sync to multiple destinations
  2. AI content enhancement — Automated SEO optimization
  3. Dynamic content generation — Data-driven posts

Pro Tip: Start with one automation at a time. Test thoroughly before adding complexity. Document your workflows for team members.