![stud.ip-crawler-logo](studip-crawler.png)
This program downloads all files available to a given Stud.IP user. It only searches through and downloads the courses of the current semester. If you run the program again, it only downloads files that have changed since the last run.
```sh
git clone https://github.com/tiyn/studip-crawler
cd studip-crawler/src/
pip3 install -r requirements  # install dependencies
```

Then run the program via `python3 run.py [options]`.
Alternatively, you can make the script executable with
`chmod +x run.py` and
run it directly with `./run.py [options]`.
Several options are required for the program to work.
Run `python3 run.py -h` for a help menu and to see which ones are relevant for you.
Set the following environment variables with the `-e` flag.
| Name | Usage | Default |
|---|---|---|
| USER | username on the Stud.IP server | admin |
| PSWD | password on the Stud.IP server | admin |
| URL | URL of the Stud.IP server | admin |
| HOST | IP of the MySQL instance to connect to | mysql |
| DB_USER | username of the MySQL instance to connect to | root |
| DB_PSWD | password of the MySQL instance to connect to | root |
| INTERVAL | update interval in seconds | 86400 |
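Putting these variables together, a `docker run` invocation might look like the following sketch. The image name `studip-crawler` and all example values are assumptions, not taken from this README; adjust them to your setup.

```sh
# Hypothetical example: the image name "studip-crawler" and all values
# are placeholders; replace them with your own Stud.IP and MySQL details.
docker run -d \
  -e USER=myuser \
  -e PSWD=mypassword \
  -e URL=https://studip.example.org \
  -e HOST=mysql \
  -e DB_USER=root \
  -e DB_PSWD=root \
  -e INTERVAL=86400 \
  studip-crawler
```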
Set the following volumes with the `-v` flag.
| Volume-Name | Container mount | Description |
|---|---|---|
| studip_data | /studip/src/data | directory for Stud.IP files to be saved to |
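The volume is supplied the same way on the command line; a minimal sketch, again assuming the hypothetical image name `studip-crawler`:

```sh
# Hypothetical example: mounts the named volume "studip_data" at the
# container path from the table above; image name is a placeholder.
docker run -d \
  -v studip_data:/studip/src/data \
  studip-crawler
```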
Copy `docker/docker-compose.yml` and adjust it to your needs.
Then run `docker-compose up`.
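As a starting point for your own adjustments, a compose file covering the variables and volume above might look like this sketch. The service names, build context, MySQL image tag, and all credentials are assumptions, not taken from this README.

```yaml
# Hypothetical sketch; adapt names, credentials, and image tags to your setup.
version: "3"
services:
  studip-crawler:
    build: .
    environment:
      - USER=myuser
      - PSWD=mypassword
      - URL=https://studip.example.org
      - HOST=mysql
      - DB_USER=root
      - DB_PSWD=root
      - INTERVAL=86400
    volumes:
      - studip_data:/studip/src/data
    depends_on:
      - mysql
  mysql:
    image: mysql
    environment:
      - MYSQL_ROOT_PASSWORD=root
volumes:
  studip_data:
```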