Scheduled Tasks

Turn your scripts into automated background services. Put your Python logic on autopilot and run code even while you're offline.

Your Scripts on Autopilot

Python Online provides a built-in scheduler that allows you to execute your code at regular intervals without manual intervention. This is a set-and-forget system: once a task is scheduled, our infrastructure handles the execution, logging, and lifecycle management automatically.

Common Use Cases

  • Data Scraping: Collect news, prices, or stock data every hour.
  • Automated Reporting: Process datasets and generate a daily summary file.
  • Database Maintenance: Clean up temporary records or sync data between APIs.
  • System Checks: Monitor an external website or service and log its availability.
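As a minimal sketch of the last use case, an availability check could fetch a URL and print a one-line status entry that lands in the task's log. The URL below is a placeholder for whatever service you want to monitor:

```python
from datetime import datetime, timezone
from urllib import request, error

URL = "https://example.com"  # placeholder: the service you want to monitor

def check_availability(url: str) -> str:
    """Return a one-line log entry describing the service's status."""
    timestamp = datetime.now(timezone.utc).isoformat()
    try:
        with request.urlopen(url, timeout=10) as response:
            return f"{timestamp} UP {response.status}"
    except error.URLError as exc:
        return f"{timestamp} DOWN {exc.reason}"

if __name__ == "__main__":
    # print() output is captured in the task's log
    print(check_availability(URL))
```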

Creating a Scheduled Task

Scheduling is managed through the Tasks Tab in the Project Dashboard. To schedule a job, follow these steps:

  1. Identify the Entry Point: Ensure the script you want to automate is saved in your project (e.g., scripts/scraper.py).
  2. Open the Dashboard: Click your project name in the header and navigate to "Tasks."
  3. Click Schedule (+):
    • Task Name: Give your job a descriptive label (e.g., "Daily CSV Backup").
    • Script Path: Provide the path relative to your project root. (Example: src/main.py).
    • Frequency: Select how often the script should execute.
  4. Save: The task is now registered; its first run starts immediately, and subsequent runs follow the scheduled interval.

Independent Execution: Scheduled tasks are treated as independent workloads. A task belonging to "Project A" will run on its schedule even if you are actively working inside "Project B" or have the IDE closed entirely.

The Execution Environment

Registered users are permitted to schedule 1 Daily Task. When the scheduler triggers, the platform boots a brand new, temporary Linux container exclusively for that task.

  • Clean Slate: The container has fresh memory. No variables from previous runs carry over.
  • The 60-Second Wall: To ensure fairness across the cluster, scheduled tasks are granted a strict 60-second execution window. If your script takes longer, the server will forcefully terminate the container to reclaim resources.
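One way to stay inside the 60-second window is to track elapsed time in your work loop and stop cleanly before the container is killed. The sketch below assumes a 50-second budget to leave a safety margin; the work items are placeholders:

```python
import time

TIME_BUDGET = 50  # seconds: stop well before the 60-second hard limit

def process_items(items):
    """Process as many items as fit in the time budget, then stop cleanly."""
    start = time.monotonic()
    done = 0
    for item in items:
        if time.monotonic() - start > TIME_BUDGET:
            print(f"Time budget reached after {done} items; exiting cleanly.")
            break
        # ... real per-item work would go here ...
        done += 1
    return done

if __name__ == "__main__":
    processed = process_items(range(1000))
    print(f"Processed {processed} items.")
```

Ending early with a log message is far easier to debug than a forced termination, which simply cuts the run off mid-task.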

The Headless Standard

It is important to understand that scheduled tasks run in Headless Mode. Because there is no user present during execution, the environment is strictly non-interactive.

Warning: If your code contains the input() function, the execution engine will gracefully bypass the prompt and continue execution to prevent the script from hanging indefinitely.

Similarly, any attempt to render an interactive plot using matplotlib.pyplot.show() will be intercepted and discarded, as there is no visual console to display it.
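A common headless-safe pattern is to read settings from environment variables or defaults instead of prompting with input(), and to save plots to files (for example with matplotlib's savefig) rather than calling show(). Here is a sketch of the input() side; the THRESHOLD variable name is an assumption for illustration:

```python
import os

def get_threshold(default: float = 0.5) -> float:
    """Read a setting from the environment instead of prompting with input()."""
    raw = os.environ.get("THRESHOLD")  # hypothetical setting name
    if raw is None:
        return default
    try:
        return float(raw)
    except ValueError:
        print(f"Invalid THRESHOLD value {raw!r}; falling back to {default}")
        return default

if __name__ == "__main__":
    print(f"Using threshold: {get_threshold()}")
```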

Monitoring & Success Tracking

Since you aren't watching the code run, Python Online provides tools to verify that your automation is healthy.

Real-Time Status

The Tasks Dashboard displays a status indicator for every job:

  • Success: The script finished with an exit code of 0.
  • Failed: The script crashed or hit the timeout limit.
  • Running: The task is currently executing.
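Because the dashboard keys off the exit code, you can control the reported status explicitly: returning 0 from your entry point marks the run as Success, anything else as Failed. A minimal sketch, with the task logic left as a placeholder:

```python
import sys

def main() -> int:
    """Run the task; the return value becomes the process exit code."""
    try:
        # ... real task logic would go here ...
        print("Job completed.")
        return 0  # exit code 0 → dashboard shows Success
    except Exception as exc:
        print(f"Job failed: {exc}", file=sys.stderr)
        return 1  # non-zero → dashboard shows Failed

# In a real script, hand the code to the interpreter:
# sys.exit(main())
```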

Persistent Task Logs

Every task maintains a History Log of its most recent execution. By clicking the "Log" icon, you can see the full stdout and stderr of the run. This includes all your print() statements and, most importantly, the Python traceback if the script failed.

Note: To keep your storage clean, logs are overwritten with each new run.

Best Practices for Background Scripts

  • Relative Paths: Always reference files using relative paths. Your script is executed with the project root as the working directory. Use open('data/output.txt', 'w') rather than absolute paths.
  • Robust Error Handling: Use try/except blocks around your main logic. If a network error occurs during a scrape, catching the error and printing a custom message will make your logs much easier to debug.
  • Resource Efficiency: Avoid infinite loops. Even Pro tasks are subject to a 1-hour "Hard Kill" timer to prevent runaway processes from consuming shared server resources.
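The first two practices can be combined into a pattern like the one below. The scrape step and file path are placeholders; the key points are the relative path and the traceback printed to stdout so it appears in the task's log:

```python
import os
import traceback

def scrape() -> str:
    """Placeholder for real scraping logic (e.g. an HTTP fetch)."""
    return "scraped data"

def run() -> bool:
    """Run the job; catch errors so the log shows a readable traceback."""
    try:
        data = scrape()
        os.makedirs("data", exist_ok=True)
        # Relative path: the working directory is the project root
        with open("data/output.txt", "w") as fh:
            fh.write(data)
        print("Run succeeded; wrote data/output.txt")
        return True
    except Exception:
        # print() output, including this traceback, appears in the task's log
        print("Run failed:")
        print(traceback.format_exc())
        return False

if __name__ == "__main__":
    run()
```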