Web app for Scrapyd cluster management, Scrapy log analysis & visualization, Auto packaging, Timer tasks, Monitor & Alert, and Mobile UI.


🔤 English | 🀄 简体中文

ScrapydWeb: Web app for Scrapyd cluster management, with support for Scrapy log analysis & visualization.


Scrapyd ✖ ScrapydWeb ✖ LogParser

📖 Recommended Reading

🔗 How to efficiently manage your distributed web scraping projects

🔗 How to set up Scrapyd cluster on Heroku

👀 Demo

🔗 scrapydweb.herokuapp.com

🚀 Sponsor

This project is supported by IPRoyal. Get premium-quality proxies at unbeatable prices, with a discount via this referral link!

🌠 Features

  • 💠 Scrapyd Cluster Management

    • 💯 All Scrapyd JSON APIs supported
    • ☑️ Group, filter and select any number of nodes
    • 🖱️ Execute commands on multiple nodes with just a few clicks
  • 🔍 Scrapy Log Analysis

    • 📊 Stats collection
    • 📈 Progress visualization
    • 📑 Logs categorization
  • 🔋 Enhancements

    • 📦 Auto packaging
    • 🕵️‍♂️ Integrated with 🔗 LogParser
    • ⏰ Timer tasks
    • 📧 Monitor & Alert
    • 📱 Mobile UI
    • 🔐 Basic auth for web UI
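As a rough illustration of the cluster-management idea (not ScrapydWeb's actual internals): executing a command on multiple nodes boils down to calling the same Scrapyd JSON API endpoint once per selected node. The node addresses and the sample response body below are made up.

```python
import json

# Hypothetical node list; Scrapyd listens on port 6800 by default.
nodes = ["127.0.0.1:6800", "192.168.0.101:6800"]

def scrapyd_api_url(node, endpoint):
    """Build the URL of a Scrapyd JSON API endpoint for one node."""
    return "http://%s/%s" % (node, endpoint)

# "Execute commands on multiple nodes" = one request per selected node.
urls = [scrapyd_api_url(n, "listprojects.json") for n in nodes]

# A typical Scrapyd listprojects.json response body looks like this:
sample_body = '{"status": "ok", "projects": ["demo"]}'
projects = json.loads(sample_body)["projects"]
```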

💻 Getting Started


⚠️ Prerequisites

Make sure that 🔗 Scrapyd has been installed and started on all of your hosts.

‼️ Note that for remote access, you have to manually set 'bind_address = 0.0.0.0' in 🔗 the configuration file of Scrapyd and restart Scrapyd to make it visible externally.
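For reference, a minimal scrapyd.conf with remote access enabled might look like the sketch below. The bind_address line is the only change the note above requires; the port shown is Scrapyd's default.

```ini
[scrapyd]
bind_address = 0.0.0.0
http_port    = 6800
```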

⬇️ Install

  • Use pip:
pip install scrapydweb

❗ Note that you may need to run python -m pip install --upgrade pip first in order to get the latest version of scrapydweb. Alternatively, download the tar.gz file from https://pypi.org/project/scrapydweb/#files and install it via pip install scrapydweb-x.x.x.tar.gz

  • Use git:
pip install --upgrade git+https://github.com/my8100/scrapydweb.git

Or:

git clone https://github.com/my8100/scrapydweb.git
cd scrapydweb
python setup.py install

▶️ Start

  1. Start ScrapydWeb via the command scrapydweb. (A config file for customizing settings is generated on first startup.)
  2. Visit http://127.0.0.1:5000. (Google Chrome is recommended for a better experience.)
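The generated config file is a plain Python module of settings. A minimal sketch of the kind of values it holds is shown below; the addresses and credentials are placeholders, so treat the generated file itself as the authoritative list of setting names and defaults.

```python
# Sketch of a ScrapydWeb settings module (all values are placeholders).
SCRAPYDWEB_BIND = '0.0.0.0'   # listen on all interfaces for remote access
SCRAPYDWEB_PORT = 5000

# One entry per Scrapyd node: either a 'host:port' string or a
# (username, password, host, port, group) tuple.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',
    ('admin', 'passwd', '192.168.0.101', '6800', 'group-a'),
]

# Basic auth for the web UI
ENABLE_AUTH = True
USERNAME = 'admin'
PASSWORD = 'change-me'
```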

🌐 Browser Support

The latest versions of Google Chrome, Firefox, and Safari.

✔️ Running the tests

$ git clone https://github.com/my8100/scrapydweb.git
$ cd scrapydweb

# To create isolated Python environments
$ pip install virtualenv
$ virtualenv venv/scrapydweb
# Or specify your Python interpreter: $ virtualenv -p /usr/local/bin/python3.7 venv/scrapydweb
$ source venv/scrapydweb/bin/activate

# Install dependent libraries
(scrapydweb) $ python setup.py install
(scrapydweb) $ pip install pytest
(scrapydweb) $ pip install coverage

# Make sure Scrapyd has been installed and started, then update the custom_settings item in tests/conftest.py
(scrapydweb) $ vi tests/conftest.py
(scrapydweb) $ curl http://127.0.0.1:6800

# '-x': stop on first failure
(scrapydweb) $ coverage run --source=scrapydweb -m pytest tests/test_a_factory.py -s -vv -x
(scrapydweb) $ coverage run --source=scrapydweb -m pytest tests -s -vv --disable-warnings
(scrapydweb) $ coverage report
# To create an HTML report, check out htmlcov/index.html
(scrapydweb) $ coverage html

🏗️ Built With


📋 Changelog

Detailed changes for each release are documented in 🔗 HISTORY.md.

👨‍💻 Author


my8100

👥 Contributors


Kaisla

©️ License

This project is licensed under the GNU General Public License v3.0 - see the 🔗 LICENSE file for details.

⭐ Stargazers over time
