Bump version: 1.3.0 to 1.4.0
my8100 committed Aug 16, 2019
1 parent 45af661 commit c688ecf
Showing 36 changed files with 22 additions and 23 deletions.
20 changes: 4 additions & 16 deletions .circleci/config.yml
@@ -69,14 +69,14 @@ jobs:
- run:
name: Setup DATA_PATH
command: |
printf "\nDATA_PATH = '"$DATA_PATH"'\n" >> scrapydweb_settings_v9.py
echo $DATA_PATH
- when:
condition: <<parameters.use-sqlite>>
steps:
- run:
name: Set DATABASE_URL to sqlite
command: |
printf "\nDATABASE_URL = '"$DATABASE_URL"'\n" >> scrapydweb_settings_v9.py
echo $DATABASE_URL
- when:
condition: <<parameters.use-postgresql>>
steps:
@@ -87,11 +87,6 @@ jobs:
# createdb: could not connect to database template1: FATAL: role "circleci" does not exist
# sudo apt install -y postgresql-client
# createdb -h localhost scrapydweb_apscheduler -O circleci
- run:
name: Set DATABASE_URL to postgresql
command: |
# postgres://<username>@127.0.0.1:5432
printf "\nDATABASE_URL = '"$DATABASE_URL"'\n" >> scrapydweb_settings_v9.py
- when:
condition: <<parameters.use-mysql>>
steps:
@@ -117,11 +112,6 @@ jobs:
# mysql -h 127.0.0.1 -u root -prootpw -e "create database scrapydweb_timertasks"
# mysql -h 127.0.0.1 -u root -prootpw -e "create database scrapydweb_metadata"
# mysql -h 127.0.0.1 -u root -prootpw -e "create database scrapydweb_jobs"
- run:
name: Set DATABASE_URL to mysql
command: |
# mysql://user:<password>@127.0.0.1:3306
printf "\nDATABASE_URL = '"$DATABASE_URL"'\n" >> scrapydweb_settings_v9.py
- run:
name: Install dependencies
@@ -168,10 +158,8 @@ jobs:
- run:
name: Generate report
command: |
touch scrapydweb_settings_v9.py
cat scrapydweb_settings_v9.py
echo $DATA_PATH
echo $DATABASE_URL
echo DATA_PATH: $DATA_PATH
echo DATABASE_URL: $DATABASE_URL
. venv/bin/activate
coverage report
coverage html
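Before this commit, the CI job injected DATA_PATH and DATABASE_URL by appending assignments to scrapydweb_settings_v9.py with printf; the new config relies on plain environment variables instead. A minimal, illustrative Python equivalent of the old append step (file name and keys taken from the hunk above, everything else assumed):

    # Illustrative sketch only, not part of the commit: replicate the printf-based
    # append that the pre-1.4.0 CI steps performed on the versioned settings file.
    import os

    settings_file = "scrapydweb_settings_v9.py"  # renamed to scrapydweb_settings_v10.py by this commit
    with open(settings_file, "a") as f:
        f.write("\nDATA_PATH = %r\n" % os.environ.get("DATA_PATH", ""))
        f.write("\nDATABASE_URL = %r\n" % os.environ.get("DATABASE_URL", ""))
    print("appended overrides to", settings_file)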
10 changes: 10 additions & 0 deletions HISTORY.md
@@ -1,5 +1,15 @@
Release History
===============
1.4.0 (2019-08-16)
------------------
- New Features
- Add API for sending text or alert via Slack, Telegram, or Email
- Improvements
- UI improvements on sidebar and multinode buttons
- Others
- Update config file to scrapydweb_settings_v10.py


[1.3.0](https://github.com/my8100/scrapydweb/issues?q=is%3Aclosed+milestone%3A1.3.0) (2019-08-04)
------------------
- New Features
2 changes: 1 addition & 1 deletion README.md
@@ -44,7 +44,7 @@
- :package: **Auto packaging**
- :male_detective: **Integrated with [:link: *LogParser*](https://github.com/my8100/logparser)**
- :alarm_clock: **Timer tasks**
- :e-mail: **Email notice**
- :e-mail: **Monitor & Alert**
- :iphone: Mobile UI
- :closed_lock_with_key: Basic auth for web UI

2 changes: 1 addition & 1 deletion README_CN.md
@@ -44,7 +44,7 @@
- :package: **自动打包项目**
- :male_detective: **集成 [:link: *LogParser*](https://github.com/my8100/logparser)**
- :alarm_clock: **定时器任务**
- :e-mail: **邮件通知**
- :e-mail: **监控和警报**
- :iphone: 移动端 UI
- :closed_lock_with_key: web UI 支持基本身份认证

2 changes: 1 addition & 1 deletion scrapydweb/__version__.py
@@ -1,7 +1,7 @@
# coding: utf-8

__title__ = 'scrapydweb'
__version__ = '1.3.0'
__version__ = '1.4.0'
__author__ = 'my8100'
__author_email__ = 'my8100@gmail.com'
__url__ = 'https://github.com/my8100/scrapydweb'
4 changes: 2 additions & 2 deletions scrapydweb/default_settings.py
@@ -344,7 +344,7 @@

# The default is '', which means saving all program data in the Python directory.
# e.g. 'C:/Users/username/scrapydweb_data' or '/home/username/scrapydweb_data'
DATA_PATH = ''
DATA_PATH = os.environ.get('DATA_PATH', '')

# The default is '', which means saving data of Jobs and Timer Tasks in DATA_PATH using SQLite.
# The data could be also saved in MySQL or PostgreSQL backend in order to improve concurrency.
@@ -355,4 +355,4 @@
# 'postgres://username:password@127.0.0.1:5432'
# 'sqlite:///C:/Users/username'
# 'sqlite:////home/username'
DATABASE_URL = ''
DATABASE_URL = os.environ.get('DATABASE_URL', '')
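With this change, both values can be supplied through the environment instead of by editing the settings file; the '' defaults keep the old behaviour (data in the Python directory, SQLite backend). A short usage sketch, with an illustrative DATABASE_URL value:

    # Illustrative only: demonstrate the os.environ.get fallbacks added above.
    import os

    os.environ.setdefault("DATABASE_URL", "sqlite:////home/username")  # example value from the comments above
    DATA_PATH = os.environ.get("DATA_PATH", "")        # '' -> program data stays in the Python directory
    DATABASE_URL = os.environ.get("DATABASE_URL", "")  # '' -> Jobs/Timer Tasks data in SQLite under DATA_PATH
    print(DATA_PATH or "<default DATA_PATH>")
    print(DATABASE_URL)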
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
2 changes: 1 addition & 1 deletion scrapydweb/templates/base.html
@@ -280,7 +280,7 @@ <h3>System</h3>
</ul>
</div>
<div class="github">
<a id="scrapydweb_version" class="request" href="https://pypi.org/project/scrapydweb/" target="_blank">v{{ SCRAPYDWEB_VERSION }} DEV</a>
<a id="scrapydweb_version" class="request" href="https://pypi.org/project/scrapydweb/" target="_blank">v{{ SCRAPYDWEB_VERSION }}</a>
<a class="github-button" href="{{ GITHUB_URL.replace('/scrapydweb', '') }}" aria-label="@my8100 on GitHub">GitHub</a>
<div>
<!-- <ul id="links"> -->
1 change: 1 addition & 0 deletions scrapydweb/utils/check_app_config.py
@@ -200,6 +200,7 @@ def check_assert(key, default, is_instance, allow_zero=True, non_empty=False, co

check_assert('EMAIL_PASSWORD', '', str)
if config.get('EMAIL_PASSWORD', ''):
logger.debug("Found EMAIL_PASSWORD, checking email settings")
check_assert('EMAIL_SUBJECT', '', str)
check_assert('EMAIL_USERNAME', '', str) # '' would default to config['EMAIL_SENDER']
# check_assert('EMAIL_PASSWORD', '', str, non_empty=True)
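check_assert itself is not shown in this hunk; a hypothetical validator matching the call pattern above (config is passed explicitly here, while the real helper may read it from an enclosing scope) could look like:

    # Hypothetical sketch of a check_assert-style validator; the actual helper in
    # scrapydweb/utils/check_app_config.py may differ in signature and behaviour.
    def check_assert(config, key, default, is_instance, allow_zero=True, non_empty=False):
        value = config.setdefault(key, default)
        assert isinstance(value, is_instance), "%s must be %s, got %r" % (key, is_instance, value)
        if non_empty:
            assert value, "%s must not be empty" % key
        if not allow_zero:
            assert value != 0, "%s must not be zero" % key

    check_assert({"EMAIL_PASSWORD": "secret"}, "EMAIL_PASSWORD", "", str)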
2 changes: 1 addition & 1 deletion scrapydweb/vars.py
@@ -15,7 +15,7 @@

PYTHON_VERSION = '.'.join([str(n) for n in sys.version_info[:3]])
PY2 = sys.version_info.major < 3
SCRAPYDWEB_SETTINGS_PY = 'scrapydweb_settings_v9.py'
SCRAPYDWEB_SETTINGS_PY = 'scrapydweb_settings_v10.py'
try:
custom_settings_module = importlib.import_module(os.path.splitext(SCRAPYDWEB_SETTINGS_PY)[0])
except ImportError:
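The settings module name bump means scrapydweb now looks for scrapydweb_settings_v10.py. Since the hunk is truncated inside the except block, the fallback behaviour is assumed in this self-contained sketch of the import pattern shown above:

    # Sketch of the import pattern above; what happens on ImportError is assumed,
    # as the diff cuts off inside the except block.
    import importlib
    import os

    SCRAPYDWEB_SETTINGS_PY = 'scrapydweb_settings_v10.py'
    try:
        custom_settings_module = importlib.import_module(os.path.splitext(SCRAPYDWEB_SETTINGS_PY)[0])
    except ImportError:
        custom_settings_module = None  # no custom settings file importable; built-in defaults apply
    print(custom_settings_module)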
