Merge branch 'master' of github.com:rtfd/readthedocs.org into humitos/deprecate-python2

Manuel Kaufmann 2019-01-21 18:04:42 +01:00
commit d8dacdee8f
88 changed files with 2239 additions and 1775 deletions

.github/config.yml

@ -0,0 +1,9 @@
# ProBot TODO bot
# https://probot.github.io/apps/todo/
todo:
autoAssign: false
blobLines: 7
caseSensitive: true
keyword: "TODO"

.github/mergeable.yml

@ -0,0 +1,28 @@
# ProBot Mergeable Bot
# https://github.com/jusx/mergeable
mergeable:
pull_requests:
approvals:
# Minimum of approvals needed.
min: 1
message: 'The PR must have a minimum of 1 approval.'
description:
no_empty:
# Do not allow empty descriptions on PR.
enabled: false
message: 'Description can not be empty.'
must_exclude:
# Do not allow 'DO NOT MERGE' phrase on PR's description.
regex: 'DO NOT MERGE'
message: 'Description says that the PR should not be merged yet.'
# Do not allow 'WIP' on PR's title.
title: 'WIP'
label:
# Do not allow PR with label 'PR: work in progress'
must_exclude: 'PR: work in progress'
message: 'This PR is work in progress.'

.github/no-response.yml

@ -0,0 +1,17 @@
# ProBot No Response Bot
# https://probot.github.io/apps/no-response/
# Number of days of inactivity before an Issue is closed for lack of response
daysUntilClose: 14
# Label requiring a response
responseRequiredLabel: 'Needed: more information'
# Comment to post when closing an Issue for lack of response. Set to `false` to disable
closeComment: >
This issue has been automatically closed because
[there has been no response to our request for more information](https://docs.readthedocs.io/en/latest/contribute.html#initial-triage)
from the original author. With only the information that is currently in the issue,
we don't have enough information to take action.
Please reach out if you have or find the answers we need so that we can investigate further.
Thanks!

.github/stale.yml

@ -0,0 +1,26 @@
# ProBot Stale Bot
# https://probot.github.io/apps/stale/
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 45
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
# Issues with these labels will never be considered stale
exemptLabels:
- 'Accepted'
- 'Needed: design decision'
- 'Status: blocked'
# Label to use when marking an issue as stale
staleLabel: 'Status: stale'
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false

.gitignore

@ -7,6 +7,7 @@
.DS_Store
.cache
.coverage
.coverage.*
.idea
.vagrant
.vscode
@ -22,7 +23,7 @@ celerybeat-schedule.*
deploy/.vagrant
dist/*
local_settings.py
locks/*
locks/**
logs/*
media/dash
media/epub


@ -3,4 +3,4 @@ formats: all
sphinx:
configuration: docs/conf.py
python:
requirements: requirements.txt
requirements: requirements/local-docs-build.txt


@ -1,3 +1,56 @@
Version 2.8.5
-------------
:Date: January 15, 2019
* `@stsewd <http://github.com/stsewd>`__: Use the python path from virtualenv in Conda (`#5110 <https://github.com/rtfd/readthedocs.org/pull/5110>`__)
* `@humitos <http://github.com/humitos>`__: Feature flag to use `readthedocs/build:testing` image (`#5109 <https://github.com/rtfd/readthedocs.org/pull/5109>`__)
* `@stsewd <http://github.com/stsewd>`__: Use python from virtualenv's bin directory when executing commands (`#5107 <https://github.com/rtfd/readthedocs.org/pull/5107>`__)
* `@humitos <http://github.com/humitos>`__: Do not build projects from banned users (`#5096 <https://github.com/rtfd/readthedocs.org/pull/5096>`__)
* `@agjohnson <http://github.com/agjohnson>`__: Fix common pieces (`#5095 <https://github.com/rtfd/readthedocs.org/pull/5095>`__)
* `@rainwoodman <http://github.com/rainwoodman>`__: Suppress progress bar of the conda command. (`#5094 <https://github.com/rtfd/readthedocs.org/pull/5094>`__)
* `@humitos <http://github.com/humitos>`__: Remove unused suggestion block from 404 pages (`#5087 <https://github.com/rtfd/readthedocs.org/pull/5087>`__)
* `@humitos <http://github.com/humitos>`__: Remove header nav (Login/Logout button) on 404 pages (`#5085 <https://github.com/rtfd/readthedocs.org/pull/5085>`__)
* `@stsewd <http://github.com/stsewd>`__: Fix little typo (`#5084 <https://github.com/rtfd/readthedocs.org/pull/5084>`__)
* `@agjohnson <http://github.com/agjohnson>`__: Split up deprecated view notification to GitHub and other webhook endpoints (`#5083 <https://github.com/rtfd/readthedocs.org/pull/5083>`__)
* `@humitos <http://github.com/humitos>`__: Install ProBot (`#5082 <https://github.com/rtfd/readthedocs.org/pull/5082>`__)
* `@stsewd <http://github.com/stsewd>`__: Update docs about contributing to docs (`#5077 <https://github.com/rtfd/readthedocs.org/pull/5077>`__)
* `@humitos <http://github.com/humitos>`__: Declare and improve invoke tasks (`#5075 <https://github.com/rtfd/readthedocs.org/pull/5075>`__)
* `@davidfischer <http://github.com/davidfischer>`__: Fire a signal for domain verification (eg. for SSL) (`#5071 <https://github.com/rtfd/readthedocs.org/pull/5071>`__)
* `@agjohnson <http://github.com/agjohnson>`__: Update copy on notifications for github services deprecation (`#5067 <https://github.com/rtfd/readthedocs.org/pull/5067>`__)
* `@humitos <http://github.com/humitos>`__: Upgrade all packages with pur (`#5059 <https://github.com/rtfd/readthedocs.org/pull/5059>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Reduce logging to sentry (`#5054 <https://github.com/rtfd/readthedocs.org/pull/5054>`__)
* `@discdiver <http://github.com/discdiver>`__: fixed missing apostrophe for possessive "project's" (`#5052 <https://github.com/rtfd/readthedocs.org/pull/5052>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Template improvements in "gold/subscription_form.html" (`#5049 <https://github.com/rtfd/readthedocs.org/pull/5049>`__)
* `@merwok <http://github.com/merwok>`__: Fix link in features page (`#5048 <https://github.com/rtfd/readthedocs.org/pull/5048>`__)
* `@stsewd <http://github.com/stsewd>`__: Update webhook docs (`#5040 <https://github.com/rtfd/readthedocs.org/pull/5040>`__)
* `@stsewd <http://github.com/stsewd>`__: Remove sphinx static and template dir (`#5039 <https://github.com/rtfd/readthedocs.org/pull/5039>`__)
* `@stephenfin <http://github.com/stephenfin>`__: Add temporary method for disabling shallow cloning (#5031) (`#5036 <https://github.com/rtfd/readthedocs.org/pull/5036>`__)
* `@stsewd <http://github.com/stsewd>`__: Raise exception in failed checkout (`#5035 <https://github.com/rtfd/readthedocs.org/pull/5035>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Change default_branch value from Version.slug to Version.identifier (`#5034 <https://github.com/rtfd/readthedocs.org/pull/5034>`__)
* `@humitos <http://github.com/humitos>`__: Make wipe view not CSRF exempt (`#5025 <https://github.com/rtfd/readthedocs.org/pull/5025>`__)
* `@humitos <http://github.com/humitos>`__: Convert an IRI path to URI before setting as NGINX header (`#5024 <https://github.com/rtfd/readthedocs.org/pull/5024>`__)
* `@safwanrahman <http://github.com/safwanrahman>`__: index project asynchronously (`#5023 <https://github.com/rtfd/readthedocs.org/pull/5023>`__)
* `@stsewd <http://github.com/stsewd>`__: Keep command output when it's killed (`#5015 <https://github.com/rtfd/readthedocs.org/pull/5015>`__)
* `@stsewd <http://github.com/stsewd>`__: More hints for invalid submodules (`#5012 <https://github.com/rtfd/readthedocs.org/pull/5012>`__)
* `@ericholscher <http://github.com/ericholscher>`__: Release 2.8.4 (`#5011 <https://github.com/rtfd/readthedocs.org/pull/5011>`__)
* `@stsewd <http://github.com/stsewd>`__: Remove `auto` doctype (`#5010 <https://github.com/rtfd/readthedocs.org/pull/5010>`__)
* `@davidfischer <http://github.com/davidfischer>`__: Tweak sidebar ad priority (`#5005 <https://github.com/rtfd/readthedocs.org/pull/5005>`__)
* `@stsewd <http://github.com/stsewd>`__: Replace git status and git submodules status for gitpython (`#5002 <https://github.com/rtfd/readthedocs.org/pull/5002>`__)
* `@davidfischer <http://github.com/davidfischer>`__: Backport jquery 2432 to Read the Docs (`#5001 <https://github.com/rtfd/readthedocs.org/pull/5001>`__)
* `@stsewd <http://github.com/stsewd>`__: Refactor remove_dir (`#4994 <https://github.com/rtfd/readthedocs.org/pull/4994>`__)
* `@humitos <http://github.com/humitos>`__: Skip builds when project is not active (`#4991 <https://github.com/rtfd/readthedocs.org/pull/4991>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Make $ unselectable in docs (`#4990 <https://github.com/rtfd/readthedocs.org/pull/4990>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Remove deprecated "models.permalink" (`#4975 <https://github.com/rtfd/readthedocs.org/pull/4975>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Add validation for tags of length greater than 100 characters (`#4967 <https://github.com/rtfd/readthedocs.org/pull/4967>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Add test case for send_notifications on VersionLockedError (`#4958 <https://github.com/rtfd/readthedocs.org/pull/4958>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Remove trailing slashes on svn checkout (`#4951 <https://github.com/rtfd/readthedocs.org/pull/4951>`__)
* `@stsewd <http://github.com/stsewd>`__: Safe symlink on version deletion (`#4937 <https://github.com/rtfd/readthedocs.org/pull/4937>`__)
* `@humitos <http://github.com/humitos>`__: CRUD for EnvironmentVariables from Project's admin (`#4899 <https://github.com/rtfd/readthedocs.org/pull/4899>`__)
* `@humitos <http://github.com/humitos>`__: Notify users about the usage of deprecated webhooks (`#4898 <https://github.com/rtfd/readthedocs.org/pull/4898>`__)
* `@dojutsu-user <http://github.com/dojutsu-user>`__: Disable django guardian warning (`#4892 <https://github.com/rtfd/readthedocs.org/pull/4892>`__)
* `@humitos <http://github.com/humitos>`__: Handle 401, 403 and 404 status codes when hitting GitHub for webhook (`#4805 <https://github.com/rtfd/readthedocs.org/pull/4805>`__)
Version 2.8.4
-------------


@ -1,4 +1,4 @@
Copyright (c) 2010-2017 Read the Docs, Inc & contributors
Copyright (c) 2010-2019 Read the Docs, Inc & contributors
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation


@ -78,6 +78,6 @@ when you push to GitHub.
License
-------
`MIT`_ © 2010-2017 Read the Docs, Inc & contributors
`MIT`_ © 2010-2019 Read the Docs, Inc & contributors
.. _MIT: LICENSE

common

@ -1 +1 @@
Subproject commit 8712295c9d5acf4570350f22849cb705f522ea60
Subproject commit 46aad68c905ff843559b39cb52b5d54e586115c4


@ -1,4 +1,4 @@
[rstcheck]
ignore_directives=automodule,http:get,tabs,tab
ignore_directives=automodule,http:get,tabs,tab,prompt
ignore_roles=djangosetting,setting
ignore_messages=(Duplicate implicit target name: ".*")|(Hyperlink target ".*" is not referenced)

docs/_static/css/sphinx_prompt_css.css

@ -0,0 +1,13 @@
/* CSS for sphinx-prompt */
pre.highlight {
border: 1px solid #e1e4e5;
overflow-x: auto;
margin: 1px 0 24px 0;
padding: 12px 12px;
}
pre.highlight span.prompt1 {
font-size: 12px;
line-height: 1.4;
}


@ -53,9 +53,9 @@ Project list
**Example request**:
.. sourcecode:: bash
.. prompt:: bash $
$ curl https://readthedocs.org/api/v2/project/?slug=pip
curl https://readthedocs.org/api/v2/project/?slug=pip
**Example response**:
@ -232,9 +232,9 @@ Build list
**Example request**:
.. sourcecode:: bash
.. prompt:: bash $
$ curl https://readthedocs.org/api/v2/build/?project__slug=pip
curl https://readthedocs.org/api/v2/build/?project__slug=pip
**Example response**:


@ -225,3 +225,8 @@ The *Sphinx* and *Mkdocs* builders set the following RTD-specific environment va
+-------------------------+--------------------------------------------------+----------------------+
| ``READTHEDOCS_PROJECT`` | The RTD name of the project which is being built | ``myexampleproject`` |
+-------------------------+--------------------------------------------------+----------------------+
.. tip::
In case extra environment variables are needed for the build process (like secrets, tokens, etc.),
you can add them by going to **Admin > Environment Variables** in your project. See :doc:`guides/environment-variables`.


@ -4,7 +4,7 @@ Sharing
.. note:: This feature only exists on our Business offering at `readthedocs.com <https://readthedocs.com/>`_.
You can share your project with users outside of your company.
There are two way to do this:
There are two ways to do this:
* by sending them a *secret link*,
* by giving them a *password*.


@ -28,6 +28,7 @@ extensions = [
'djangodocs',
'doc_extensions',
'sphinx_tabs.tabs',
'sphinx-prompt',
]
templates_path = ['_templates']
@ -82,3 +83,7 @@ html_theme_options = {
# Activate autosectionlabel plugin
autosectionlabel_prefix_document = True
def setup(app):
app.add_stylesheet('css/sphinx_prompt_css.css')


@ -49,20 +49,26 @@ install `pre-commit`_ and it will automatically run different linting tools
and `yapf`_) to check your changes before you commit them. `pre-commit` will let
you know if there were any problems that it wasn't able to fix automatically.
To run the `pre-commit` command and check your changes::
To run the `pre-commit` command and check your changes:
$ pip install -U pre-commit
$ git add <your-modified-files>
$ pre-commit run
.. prompt:: bash $
or to run against a specific file::
pip install -U pre-commit
git add <your-modified-files>
pre-commit run
$ pre-commit run --files <file.py>
or to run against a specific file:
.. prompt:: bash $
pre-commit run --files <file.py>
`pre-commit` can also be run as a git pre-commit hook. You can set this up
with::
with:
$ pre-commit install
.. prompt:: bash $
pre-commit install
After this installation, the next time you run `git commit` the `pre-commit run`
command will be run immediately and will inform you of the changes and errors.


@ -12,9 +12,11 @@ Installing Java
Elasticsearch requires Java 8 or later. Use `Oracle official documentation <http://www.oracle.com/technetwork/java/javase/downloads/index.html>`_.
or opensource distribution like `OpenJDK <http://openjdk.java.net/install/>`_.
After installing java, verify the installation by,::
After installing Java, verify the installation:
$ java -version
.. prompt:: bash $
java -version
The result should be something like this::
@ -31,52 +33,68 @@ Elasticsearch can be downloaded directly from elastic.co. For Ubuntu, it's best
RTD currently uses elasticsearch 1.x which can be easily downloaded and installed from `elastic.co
<https://www.elastic.co/downloads/past-releases/elasticsearch-1-3-8/>`_.
Install the downloaded package by following command::
Install the downloaded package by running the following command:
$ sudo apt install .{path-to-downloaded-file}/elasticsearch-1.3.8.deb
.. prompt:: bash $
sudo apt install .{path-to-downloaded-file}/elasticsearch-1.3.8.deb
Custom setup
------------
You need the icu plugin::
You need the icu plugin:
$ elasticsearch/bin/plugin -install elasticsearch/elasticsearch-analysis-icu/2.3.0
.. prompt:: bash $
elasticsearch/bin/plugin -install elasticsearch/elasticsearch-analysis-icu/2.3.0
Running Elasticsearch from command line
---------------------------------------
Elasticsearch is not started automatically after installation. How to start and stop Elasticsearch depends on whether your system uses SysV init or systemd (used by newer distributions). You can tell which is being used by running this command::
Elasticsearch is not started automatically after installation. How to start and stop Elasticsearch depends on whether your system uses SysV init or systemd (used by newer distributions). You can tell which is being used by running this command:
$ ps -p 1
.. prompt:: bash $
ps -p 1
**Running Elasticsearch with SysV init**
Use the ``update-rc.d command`` to configure Elasticsearch to start automatically when the system boots up::
Use the ``update-rc.d command`` to configure Elasticsearch to start automatically when the system boots up:
$ sudo update-rc.d elasticsearch defaults 95 10
.. prompt:: bash $
Elasticsearch can be started and stopped using the service command::
sudo update-rc.d elasticsearch defaults 95 10
$ sudo -i service elasticsearch start
$ sudo -i service elasticsearch stop
Elasticsearch can be started and stopped using the service command:
.. prompt:: bash $
sudo -i service elasticsearch start
sudo -i service elasticsearch stop
If Elasticsearch fails to start for any reason, it will print the reason for failure to STDOUT. Log files can be found in /var/log/elasticsearch/.
**Running Elasticsearch with systemd**
To configure Elasticsearch to start automatically when the system boots up, run the following commands::
To configure Elasticsearch to start automatically when the system boots up, run the following commands:
$ sudo /bin/systemctl daemon-reload
$ sudo /bin/systemctl enable elasticsearch.service
.. prompt:: bash $
Elasticsearch can be started and stopped as follows::
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
$ sudo systemctl start elasticsearch.service
$ sudo systemctl stop elasticsearch.service
Elasticsearch can be started and stopped as follows:
To verify run::
.. prompt:: bash $
$ curl http://localhost:9200
sudo systemctl start elasticsearch.service
sudo systemctl stop elasticsearch.service
To verify run:
.. prompt:: bash $
curl http://localhost:9200
You should get something like::
@ -97,12 +115,16 @@ You should get something like::
Index the data available at RTD database
----------------------------------------
You need to create the indexes::
You need to create the indexes:
$ python manage.py provision_elasticsearch
.. prompt:: bash $
In order to search through the RTD database, you need to index it into the elasticsearch index::
python manage.py provision_elasticsearch
$ python manage.py reindex_elasticsearch
In order to search through the RTD database, you need to index it into the elasticsearch index:
.. prompt:: bash $
python manage.py reindex_elasticsearch
You are ready to go!


@ -5,23 +5,29 @@ Assumptions and Prerequisites
-----------------------------
* Debian VM provisioned with python 2.7.x
* All python dependencies and setup tools are installed ::
* All python dependencies and setup tools are installed:
$ sudo apt-get install python-setuptools
$ sudo apt-get install build-essential
$ sudo apt-get install python-dev
$ sudo apt-get install libevent-dev
$ sudo easy_install pip
.. prompt:: bash $
* Git ::
sudo apt-get install python-setuptools
sudo apt-get install build-essential
sudo apt-get install python-dev
sudo apt-get install libevent-dev
sudo easy_install pip
$ sudo apt-get install git
* Git:
.. prompt:: bash $
sudo apt-get install git
* Git repo is ``git.corp.company.com:git/docs/documentation.git``
* Source documents are in ``../docs/source``
* Sphinx ::
* Sphinx:
$ sudo pip install sphinx
.. prompt:: bash $
sudo pip install sphinx
.. note:: Not using sudo may prevent access. “error: could not create '/usr/local/lib/python2.7/dist-packages/markupsafe': Permission denied”
@ -31,42 +37,52 @@ Local RTD Setup
Install RTD
~~~~~~~~~~~
To host your documentation on a local RTD installation, set it up in your VM. ::
To host your documentation on a local RTD installation, set it up in your VM:
$ mkdir checkouts
$ cd checkouts
$ git clone https://github.com/rtfd/readthedocs.org.git
$ cd readthedocs.org
$ sudo pip install -r requirements.txt
.. prompt:: bash $
mkdir checkouts
cd checkouts
git clone https://github.com/rtfd/readthedocs.org.git
cd readthedocs.org
sudo pip install -r requirements.txt
Possible Error and Resolution
`````````````````````````````
**Error**: ``error: command 'gcc' failed with exit status 1``
**Resolution**: Run the following commands. ::
**Resolution**: Run the following commands:
$ sudo apt-get update
$ sudo apt-get install python2.7-dev tk8.5 tcl8.5 tk8.5-dev tcl8.5-dev libxml2-devel libxslt-devel
$ sudo apt-get build-dep python-imaging --fix-missing
.. prompt:: bash $
On Debian 8 (jessie) the command is slightly different ::
sudo apt-get update
sudo apt-get install python2.7-dev tk8.5 tcl8.5 tk8.5-dev tcl8.5-dev libxml2-devel libxslt-devel
sudo apt-get build-dep python-imaging --fix-missing
$ sudo apt-get update
$ sudo apt-get install python2.7-dev tk8.5 tcl8.5 tk8.5-dev tcl8.5-dev libxml2-dev libxslt-dev
$ sudo apt-get build-dep python-imaging --fix-missing
On Debian 8 (jessie) the command is slightly different:
Also don't forget to re-run the dependency installation ::
.. prompt:: bash $
$ sudo pip install -r requirements.txt
sudo apt-get update
sudo apt-get install python2.7-dev tk8.5 tcl8.5 tk8.5-dev tcl8.5-dev libxml2-dev libxslt-dev
sudo apt-get build-dep python-imaging --fix-missing
Also don't forget to re-run the dependency installation:
.. prompt:: bash $
sudo pip install -r requirements.txt
Configure the RTD Server and Superuser
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1. Run the following commands. ::
1. Run the following commands:
$ ./manage.py migrate
$ ./manage.py createsuperuser
.. prompt:: bash $
./manage.py migrate
./manage.py createsuperuser
2. This will prompt you to create a superuser account for Django. Enter appropriate details. For example: ::
@ -77,10 +93,12 @@ Configure the RTD Server and Superuser
RTD Server Administration
~~~~~~~~~~~~~~~~~~~~~~~~~
Navigate to the ``../checkouts/readthedocs.org`` folder in your VM and run the following command. ::
Navigate to the ``../checkouts/readthedocs.org`` folder in your VM and run the following command:
$ ./manage.py runserver [VM IP ADDRESS]:8000
$ curl -i http://[VM IP ADDRESS]:8000
.. prompt:: bash $
./manage.py runserver [VM IP ADDRESS]:8000
curl -i http://[VM IP ADDRESS]:8000
You should now be able to log into the admin interface from any PC in your LAN at ``http://[VM IP ADDRESS]:8000/admin`` using the superuser account created in Django.
@ -90,9 +108,11 @@ Go to the dashboard at ``http://[VM IP ADDRESS]:8000/dashboard`` and follow the
Example: ``git.corp.company.com:/git/docs/documentation.git``.
2. Clone the documentation sources from Git in the VM.
3. Navigate to the root path for documentation.
4. Run the following Sphinx commands. ::
4. Run the following Sphinx commands:
$ make html
.. prompt:: bash $
make html
This generates the HTML documentation site using the default Sphinx theme. Verify the output in your local documentation folder under ``../build/html``
@ -105,24 +125,30 @@ Possible Error and Resolution
**Workaround-1**
1. In your machine, navigate to the ``.ssh`` folder. ::
1. In your machine, navigate to the ``.ssh`` folder:
$ cd .ssh/
$ cat id_rsa
.. prompt:: bash $
cd .ssh/
cat id_rsa
2. Copy the entire Private Key.
3. Now, SSH to the VM.
4. Open the ``id_rsa`` file in the VM. ::
4. Open the ``id_rsa`` file in the VM:
$ vim /home/<username>/.ssh/id_rsa
.. prompt:: bash $
vim /home/<username>/.ssh/id_rsa
5. Paste the RSA key copied from your machine and save file (``Esc``. ``:wq!``).
**Workaround 2**
SSH to the VM using the ``-A`` directive. ::
SSH to the VM using the ``-A`` directive:
$ ssh document-vm -A
.. prompt:: bash $
ssh document-vm -A
This provides all permissions for that particular remote session, which are revoked when you logout.


@ -230,3 +230,49 @@ What commit of Read the Docs is in production?
----------------------------------------------
We deploy readthedocs.org from the `rel` branch in our GitHub repository. You can see the latest commits that have been deployed by looking on GitHub: https://github.com/rtfd/readthedocs.org/commits/rel
How can I avoid search results having a deprecated version of my docs?
---------------------------------------------------------------------
If readers search for something related to your docs on Google, it will probably return the most relevant version of your documentation.
That version may already be deprecated, and you may want to stop Google from indexing it
and suggest the latest (or a newer) version instead.
To accomplish this, you can add a ``robots.txt`` file to your documentation's root so it ends up served at the root URL of your project
(for example, https://yourproject.readthedocs.io/robots.txt).
Minimal example of ``robots.txt``
+++++++++++++++++++++++++++++++++
::
User-agent: *
Disallow: /en/deprecated-version/
Disallow: /en/2.0/
.. note::
See `Google's docs`_ for its full syntax.
This file has to be served as is under ``/robots.txt``.
Depending on whether you are using Sphinx or MkDocs, you will need a different configuration for this.
Sphinx
~~~~~~
Sphinx uses the `html_extra`_ option to add static files to the output.
You need to create a ``robots.txt`` file and put it under the path defined in ``html_extra``.
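As a minimal sketch of the Sphinx side (the ``_extra`` directory name is an assumption for illustration, not something the docs mandate), ``conf.py`` would contain:

```python
# conf.py -- hypothetical snippet; "_extra" is an assumed directory name.
# Files inside a directory listed here are copied verbatim to the output
# root, so _extra/robots.txt ends up served at /robots.txt.
html_extra_path = ['_extra']
```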
MkDocs
~~~~~~
MkDocs needs the ``robots.txt`` to be in the directory defined by the `docs_dir`_ config.
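A minimal sketch for MkDocs, assuming the default layout (``docs`` is MkDocs' default ``docs_dir``; the site name is a placeholder):

```yaml
# mkdocs.yml -- hypothetical minimal config
site_name: Your Project
docs_dir: docs  # a robots.txt placed directly in this directory
                # is copied to the root of the built site
```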
.. _Google's docs: https://support.google.com/webmasters/answer/6062608
.. _html_extra: https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-html_extra_path
.. _docs_dir: https://www.mkdocs.org/user-guide/configuration/#docs_dir


@ -0,0 +1,37 @@
I Need Secrets (or Environment Variables) in my Build
=====================================================
Your documentation may depend on an authenticated service in order to build properly.
In this case, you will need some secrets to access these services.
Read the Docs provides a way to define environment variables for your project to be used in the build process.
All these variables will be exposed to all the commands executed when building your documentation.
To define an environment variable, you need to:
#. Go to your project's **Admin > Environment Variables**
#. Click on the "Add Environment Variable" button
#. Input a ``Name`` and ``Value`` (your secret goes here)
#. Click the "Save" button
.. note::
Values will never be exposed to users, even to owners of the project. Once you create an environment variable, you won't be able to see its value again, for security reasons.
After adding an environment variable from your project's admin, you can access it from your build process using Python, for example:
.. code-block:: python
# conf.py
import os
import requests
# Access to our custom environment variables
username = os.environ.get('USERNAME')
password = os.environ.get('PASSWORD')
# Request a username/password protected URL
response = requests.get(
'https://httpbin.org/basic-auth/username/password',
auth=(username, password),
)


@ -31,9 +31,9 @@ Create translatable files
To generate these ``.pot`` files, you need to run this command from your ``docs/`` directory:
.. code-block:: console
.. prompt:: bash $
$ sphinx-build -b gettext . _build/gettext
sphinx-build -b gettext . _build/gettext
.. tip::
@ -57,18 +57,18 @@ We recommend using `sphinx-intl`_ tool for this workflow.
First, you need to install it:
.. code-block:: console
.. prompt:: bash $
$ pip install sphinx-intl
pip install sphinx-intl
As a second step, we want to create a directory with each translated file per target language
(in this example we are using Spanish/Argentina and Portuguese/Brazil).
This can be achieved with the following command:
.. code-block:: console
.. prompt:: bash $
$ sphinx-intl update -p _build/gettext -l es_AR -l pt_BR
sphinx-intl update -p _build/gettext -l es_AR -l pt_BR
This command will create a directory structure similar to the following
(with one ``.po`` file per ``.rst`` file in your documentation)::
@ -113,9 +113,9 @@ To do this, run this command:
.. _transifex-client: https://docs.transifex.com/client/introduction
.. code-block:: console
.. prompt:: bash $
$ pip install transifex-client
pip install transifex-client
After installing it, you need to configure your account.
For this, you need to create an API Token for your user to access this service through the command line.
@ -126,17 +126,17 @@ This can be done under your `User's Settings`_.
Now, you need to set it up to use this token:
.. code-block:: console
.. prompt:: bash $
$ tx init --token $TOKEN --no-interactive
tx init --token $TOKEN --no-interactive
The next step is to map every ``.pot`` file you have created in the previous step to a resource under Transifex.
To achieve this, you need to run this command:
.. code-block:: console
.. prompt:: bash $
$ tx config mapping-bulk \
tx config mapping-bulk \
--project $TRANSIFEX_PROJECT \
--file-extension '.pot' \
--source-file-dir docs/_build/gettext \
@ -150,17 +150,17 @@ This command will generate a file at ``.tx/config`` with all the information nee
Finally, you need to upload these files to Transifex platform so translators can start their work.
To do this, you can run this command:
.. code-block:: console
.. prompt:: bash $
$ tx push --source
tx push --source
Now, you can go to your Transifex's project and check that there is one resource per ``.rst`` file of your documentation.
After the source files are translated using Transifex, you can download all the translations for all the languages by running:
.. code-block:: console
.. prompt:: bash $
$ tx pull --all
tx pull --all
This command will leave the ``.po`` files needed for building the documentation in the target language under ``locale/<lang>/LC_MESSAGES``.
@ -176,9 +176,9 @@ Build the documentation in target language
Finally, to build our documentation in Spanish(Argentina) we need to tell Sphinx builder the target language with the following command:
.. code-block:: console
.. prompt:: bash $
$ sphinx-build -b html -D language=es_AR . _build/html/es_AR
sphinx-build -b html -D language=es_AR . _build/html/es_AR
.. note::
@ -197,21 +197,21 @@ Once you have done changes in your documentation, you may want to make these add
#. Create the ``.pot`` files:
.. code-block:: console
.. prompt:: bash $
$ sphinx-build -b gettext . _build/gettext
sphinx-build -b gettext . _build/gettext
.. For the manual workflow, we need to run this command
.. For the manual workflow, we need to run this command
$ sphinx-intl update -p _build/gettext -l es_AR -l pt_BR
$ sphinx-intl update -p _build/gettext -l es_AR -l pt_BR
#. Push new files to Transifex
.. code-block:: console
.. prompt:: bash $
$ tx push --sources
tx push --sources
Build documentation from up to date translation
@ -221,16 +221,16 @@ When translators have finished their job, you may want to update the documentati
#. Pull up to date translations from Transifex:
.. code-block:: console
.. prompt:: bash $
$ tx pull --all
tx pull --all
#. Commit and push these changes to our repo
.. code-block:: console
.. prompt:: bash $
$ git add locale/
$ git commit -m "Update translations"
$ git push
git add locale/
git commit -m "Update translations"
git push
The last ``git push`` will trigger a build per translation defined as part of your project under Read the Docs and make it immediately available.


@ -35,7 +35,7 @@ Using the project admin dashboard
Once the requirements file has been created;
- Login to Read the Docs and go to the project admin dashboard.
- Go to ``Admin > Advanced Settings > Requirements file``.
- Go to **Admin > Advanced Settings > Requirements file**.
- Specify the path of the requirements file you just created. The path should be relative to the root directory of the documentation project.
Using a conda environment file


@ -269,9 +269,11 @@ Compiling to MO
Gettext doesn't parse any text files, it reads a binary format for faster
performance. To compile the latest PO files in the repository, Django provides
the ``compilemessages`` management command. For example, to compile all the
available localizations, just run::
available localizations, just run:
$ python manage.py compilemessages -a
.. prompt:: bash $
python manage.py compilemessages -a
You will need to do this every time you want to push updated translations to
the live site.
@ -304,12 +306,12 @@ help pages <http://help.transifex.com/features/client/>`_.
#. Update files and push sources (English) to Transifex:
.. code-block:: console
.. prompt:: bash $
$ fab i18n_push_source
fab i18n_push_source
#. Pull changes (new translations) from Transifex:
.. code-block:: console
.. prompt:: bash $
$ fab i18n_pull
fab i18n_pull


@ -20,15 +20,15 @@ Quick start
Assuming you have Python already, `install MkDocs`_:
.. sourcecode:: bash
.. prompt:: bash $
$ pip install mkdocs
pip install mkdocs
Setup your MkDocs project:
.. sourcecode:: bash
.. prompt:: bash $
$ mkdocs new .
mkdocs new .
This command creates ``mkdocs.yml`` which holds your MkDocs configuration,
and ``docs/index.md`` which is the Markdown file
@ -37,9 +37,9 @@ that is the entry point for your documentation.
You can edit this ``index.md`` file to add more details about your project
and then you can build your documentation:
.. sourcecode:: bash
.. prompt:: bash $
$ mkdocs serve
mkdocs serve
This command builds your Markdown files into HTML
and starts a development server to browse your documentation.


@ -33,23 +33,23 @@ Quick start
Assuming you have Python already, `install Sphinx`_:
.. sourcecode:: bash
.. prompt:: bash $
$ pip install sphinx
pip install sphinx
Create a directory inside your project to hold your docs:
.. sourcecode:: bash
.. prompt:: bash $
$ cd /path/to/project
$ mkdir docs
cd /path/to/project
mkdir docs
Run ``sphinx-quickstart`` in there:
.. sourcecode:: bash
.. prompt:: bash $
$ cd docs
$ sphinx-quickstart
cd docs
sphinx-quickstart
This quick start will walk you through creating the basic configuration; in most cases, you
can just accept the defaults. When it's done, you'll have an ``index.rst``, a
@ -59,9 +59,9 @@ Now, edit your ``index.rst`` and add some information about your project.
Include as much detail as you like (refer to the reStructuredText_ syntax
or `this template`_ if you need help). Build them to see how they look:
.. sourcecode:: bash
.. prompt:: bash $
$ make html
make html
Your ``index.rst`` has been built into ``index.html``
in your documentation output directory (typically ``_build/html/index.html``).
@ -88,9 +88,9 @@ Using Markdown with Sphinx
You can use Markdown and reStructuredText in the same Sphinx project.
We support this natively on Read the Docs, and you can do it locally:
.. sourcecode:: bash
.. prompt:: bash $
$ pip install recommonmark
pip install recommonmark
Then in your ``conf.py``:


@ -20,6 +20,8 @@ details and a list of HTTP exchanges that have taken place for the integration.
You need this information for the URL, webhook, or Payload URL needed by the
repository provider such as GitHub, GitLab, or Bitbucket.
.. _webhook-creation:
Webhook Creation
----------------
@ -36,6 +38,8 @@ As an example, the URL pattern looks like this: *readthedocs.org/api/v2/webhook/
Use this URL when setting up a new webhook with your provider -- these steps vary depending on the provider:
.. _webhook-integration-github:
GitHub
~~~~~~
@ -54,6 +58,8 @@ For a 403 error, it's likely that the Payload URL is incorrect.
.. note:: The webhook token, intended for the GitHub **Secret** field, is not yet implemented.
.. _webhook-integration-bitbucket:
Bitbucket
~~~~~~~~~
@ -64,6 +70,8 @@ Bitbucket
* Under **Triggers**, **Repository push** should be selected
* Finish by clicking **Save**
.. _webhook-integration-gitlab:
GitLab
~~~~~~
@ -74,6 +82,8 @@ GitLab
* Leave the default **Push events** selected and mark **Tag push events** also
* Finish by clicking **Add Webhook**
.. _webhook-integration-generic:
Using the generic API integration
---------------------------------
@ -137,3 +147,69 @@ Resyncing webhooks
It might be necessary to re-establish a webhook if you are noticing problems.
To resync a webhook from Read the Docs, visit the integration detail page and
follow the directions for re-syncing your repository webhook.
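The generic API integration mentioned above can be driven from a script. A minimal sketch follows; the project slug, integration id, and token are placeholders (the real values appear on the integration detail page), and the ``token``/``branches`` form parameters are assumptions about the endpoint, not confirmed by this page:

```python
# Build the POST request for a generic webhook integration.
# URL pattern follows readthedocs.org/api/v2/webhook/<project-name>/<id>/
from urllib.parse import urlencode

API_ROOT = 'https://readthedocs.org/api/v2/webhook'

def build_webhook_request(project_slug, integration_pk, token, branches=None):
    """Return the URL and form-encoded body for the webhook POST."""
    url = '{}/{}/{}/'.format(API_ROOT, project_slug, integration_pk)
    data = {'token': token}
    if branches:
        # Assumed: the endpoint accepts a comma-separated branch list.
        data['branches'] = ','.join(branches)
    return url, urlencode(data)

# An actual trigger would then POST it, e.g.:
#     urllib.request.urlopen(url, data=body.encode())
```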
Troubleshooting
---------------
My project isn't automatically building
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If your project isn't automatically building, you can check your integration on
Read the Docs to see the payload sent to our servers. If there is no recent
activity on your Read the Docs project webhook integration, then it's likely
that your VCS provider is not configured correctly. If there is payload
information on your Read the Docs project, you might need to verify that your
versions are configured to build correctly.
Either way, it may help to either resync your webhook integration (see
`Resyncing webhooks`_ for information on this process), or set up an entirely
new webhook integration.
.. _webhook-github-services:
I was warned I shouldn't use GitHub Services
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Last year, GitHub announced that effective Jan 31st, 2019, GitHub Services will stop
working [1]_. This means GitHub will stop sending notifications to Read the Docs
for projects configured with the ``ReadTheDocs`` GitHub Service. If your project
has been configured on Read the Docs for a long time, you are most likely still
using this service to automatically build your project on Read the Docs.
In order for your project to continue automatically building, you will need to
configure your GitHub repository with a new webhook. You can use either a
connected GitHub account and a :ref:`GitHub webhook integration <webhook-integration-github>`
on your Read the Docs project, or you can use a
:ref:`generic webhook integration <webhook-integration-generic>` without a connected
account.
.. [1] https://developer.github.com/changes/2018-04-25-github-services-deprecation/
.. _webhook-deprecated-endpoints:
I was warned that my project won't automatically build after April 1st
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In addition to :ref:`no longer supporting GitHub Services <webhook-github-services>`,
we have decided to no longer support several other legacy incoming webhook
endpoints that were used before we introduced project webhook integrations. When
we introduced our webhook integrations, we added several features and improved
security for incoming webhooks, and these features were not added to our legacy
incoming webhooks. New projects have not been able to use our legacy incoming
webhooks since then; however, if you have a project that has been established for a
while, you may still be using these endpoints.
After March 1st, 2019, we will stop accepting incoming webhook notifications for
these legacy incoming webhooks. Your project will need to be reconfigured and
have a webhook integration configured, pointing to a new webhook with your VCS
provider.
In particular, the incoming webhook URLs that will be removed are:
* ``https://readthedocs.org/build``
* ``https://readthedocs.org/bitbucket``
* ``https://readthedocs.org/github`` (as noted :ref:`above <webhook-github-services>`)
* ``https://readthedocs.org/gitlab``
In order to establish a new project webhook integration, :ref:`follow
the directions for your VCS provider <webhook-creation>`.


@ -278,9 +278,9 @@ installed in addition to the default ``requests`` and ``simplejson``, use the
Behind the scenes, the following pip command will be run:
.. code-block:: shell
.. prompt:: bash $
$ pip install .[tests,docs]
pip install .[tests,docs]
.. _issue: https://github.com/rtfd/readthedocs.org/issues
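For the ``.[tests,docs]`` syntax to work, the package must declare those extras. A hypothetical sketch of the relevant ``setup.py`` pieces (the dependency lists here are illustrative assumptions, not the project's actual requirements):

```python
# Dependencies pip always installs for this (hypothetical) package.
INSTALL_REQUIRES = ['requests', 'simplejson']

# Optional dependency groups, selectable as ``pip install .[tests,docs]``.
# A real setup.py would pass this as setup(extras_require=EXTRAS_REQUIRE).
EXTRAS_REQUIRE = {
    'tests': ['pytest', 'mock'],
    'docs': ['sphinx'],
}

def packages_for(*extras):
    """Return everything pip would install for the given extras."""
    packages = list(INSTALL_REQUIRES)
    for name in extras:
        packages.extend(EXTRAS_REQUIRE[name])
    return packages
```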


@ -1,15 +1,21 @@
# -*- coding: utf-8 -*-
"""Models for the builds app."""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
import logging
import os.path
import re
from shutil import rmtree
from builtins import object
from django.conf import settings
from django.db import models
from django.urls import reverse
from django.utils import timezone
from django.utils.encoding import python_2_unicode_compatible
from django.utils.translation import ugettext
@ -17,6 +23,7 @@ from django.utils.translation import ugettext_lazy as _
from guardian.shortcuts import assign
from jsonfield import JSONField
from taggit.managers import TaggableManager
from django.urls import reverse
from readthedocs.core.utils import broadcast
from readthedocs.projects.constants import (
@ -48,12 +55,8 @@ from .utils import (
)
from .version_slug import VersionSlugField
DEFAULT_VERSION_PRIVACY_LEVEL = getattr(
settings,
'DEFAULT_VERSION_PRIVACY_LEVEL',
'public',
)
settings, 'DEFAULT_VERSION_PRIVACY_LEVEL', 'public')
log = logging.getLogger(__name__)
@ -93,10 +96,7 @@ class Version(models.Model):
#: filesystem to determine how the paths for this version are called. It
#: must not be used for any other identifying purposes.
slug = VersionSlugField(
_('Slug'),
max_length=255,
populate_from='verbose_name',
)
_('Slug'), max_length=255, populate_from='verbose_name')
supported = models.BooleanField(_('Supported'), default=True)
active = models.BooleanField(_('Active'), default=False)
@ -114,14 +114,13 @@ class Version(models.Model):
objects = VersionManager.from_queryset(VersionQuerySet)()
class Meta:
class Meta(object):
unique_together = [('project', 'slug')]
ordering = ['-verbose_name']
permissions = (
# Translators: Permission around whether a user can view the
# version
('view_version', _('View Version')),
)
('view_version', _('View Version')),)
def __str__(self):
return ugettext(
@ -129,8 +128,7 @@ class Version(models.Model):
version=self.verbose_name,
project=self.project,
pk=self.pk,
),
)
))
@property
def config(self):
@ -141,8 +139,9 @@ class Version(models.Model):
:rtype: dict
"""
last_build = (
self.builds.filter(state='finished',
success=True).order_by('-date').first()
self.builds.filter(state='finished', success=True)
.order_by('-date')
.first()
)
return last_build.config
@ -185,9 +184,7 @@ class Version(models.Model):
# If we came that far it's not a special version nor a branch or tag.
# Therefore just return the identifier to make a safe guess.
log.debug(
'TODO: Raise an exception here. Testing what cases it happens'
)
log.debug('TODO: Raise an exception here. Testing what cases it happens')
return self.identifier
def get_absolute_url(self):
@ -201,36 +198,33 @@ class Version(models.Model):
)
private = self.privacy_level == PRIVATE
return self.project.get_docs_url(
version_slug=self.slug,
private=private,
)
version_slug=self.slug, private=private)
def save(self, *args, **kwargs): # pylint: disable=arguments-differ
"""Add permissions to the Version for all owners on save."""
from readthedocs.projects import tasks
obj = super().save(*args, **kwargs)
obj = super(Version, self).save(*args, **kwargs)
for owner in self.project.users.all():
assign('view_version', owner, self)
broadcast(
type='app',
task=tasks.symlink_project,
args=[self.project.pk],
)
type='app', task=tasks.symlink_project, args=[self.project.pk])
return obj
def delete(self, *args, **kwargs): # pylint: disable=arguments-differ
from readthedocs.projects import tasks
log.info('Removing files for version %s', self.slug)
broadcast(
type='app', task=tasks.clear_artifacts,
args=[self.get_artifact_paths()]
type='app',
task=tasks.remove_dirs,
args=[self.get_artifact_paths()],
)
project_pk = self.project.pk
super(Version, self).delete(*args, **kwargs)
broadcast(
type='app',
task=tasks.symlink_project,
args=[self.project.pk],
args=[project_pk],
)
super().delete(*args, **kwargs)
@property
def identifier_friendly(self):
@ -259,27 +253,19 @@ class Version(models.Model):
data['PDF'] = project.get_production_media_url('pdf', self.slug)
if project.has_htmlzip(self.slug):
data['HTML'] = project.get_production_media_url(
'htmlzip',
self.slug,
)
'htmlzip', self.slug)
if project.has_epub(self.slug):
data['Epub'] = project.get_production_media_url(
'epub',
self.slug,
)
'epub', self.slug)
else:
if project.has_pdf(self.slug):
data['pdf'] = project.get_production_media_url('pdf', self.slug)
if project.has_htmlzip(self.slug):
data['htmlzip'] = project.get_production_media_url(
'htmlzip',
self.slug,
)
'htmlzip', self.slug)
if project.has_epub(self.slug):
data['epub'] = project.get_production_media_url(
'epub',
self.slug,
)
'epub', self.slug)
return data
def get_conf_py_path(self):
@ -307,8 +293,7 @@ class Version(models.Model):
paths.append(
self.project.get_production_media_path(
type_=type_,
version_slug=self.slug,
),
version_slug=self.slug),
)
paths.append(self.project.rtd_build_path(version=self.slug))
@ -330,12 +315,7 @@ class Version(models.Model):
log.exception('Build path cleanup failed')
def get_github_url(
self,
docroot,
filename,
source_suffix='.rst',
action='view',
):
self, docroot, filename, source_suffix='.rst', action='view'):
"""
Return a GitHub URL for a given filename.
@ -377,12 +357,7 @@ class Version(models.Model):
)
def get_gitlab_url(
self,
docroot,
filename,
source_suffix='.rst',
action='view',
):
self, docroot, filename, source_suffix='.rst', action='view'):
repo_url = self.project.repo
if 'gitlab' not in repo_url:
return ''
@ -467,7 +442,7 @@ class APIVersion(Version):
del kwargs[key]
except KeyError:
pass
super().__init__(*args, **kwargs)
super(APIVersion, self).__init__(*args, **kwargs)
def save(self, *args, **kwargs):
return 0
@ -479,28 +454,13 @@ class Build(models.Model):
"""Build data."""
project = models.ForeignKey(
Project,
verbose_name=_('Project'),
related_name='builds',
)
Project, verbose_name=_('Project'), related_name='builds')
version = models.ForeignKey(
Version,
verbose_name=_('Version'),
null=True,
related_name='builds',
)
Version, verbose_name=_('Version'), null=True, related_name='builds')
type = models.CharField(
_('Type'),
max_length=55,
choices=BUILD_TYPES,
default='html',
)
_('Type'), max_length=55, choices=BUILD_TYPES, default='html')
state = models.CharField(
_('State'),
max_length=55,
choices=BUILD_STATE,
default='finished',
)
_('State'), max_length=55, choices=BUILD_STATE, default='finished')
date = models.DateTimeField(_('Date'), auto_now_add=True)
success = models.BooleanField(_('Success'), default=True)
@ -510,26 +470,16 @@ class Build(models.Model):
error = models.TextField(_('Error'), default='', blank=True)
exit_code = models.IntegerField(_('Exit code'), null=True, blank=True)
commit = models.CharField(
_('Commit'),
max_length=255,
null=True,
blank=True,
)
_('Commit'), max_length=255, null=True, blank=True)
_config = JSONField(_('Configuration used in the build'), default=dict)
length = models.IntegerField(_('Build Length'), null=True, blank=True)
builder = models.CharField(
_('Builder'),
max_length=255,
null=True,
blank=True,
)
_('Builder'), max_length=255, null=True, blank=True)
cold_storage = models.NullBooleanField(
_('Cold Storage'),
help_text='Build steps stored outside the database.',
)
_('Cold Storage'), help_text='Build steps stored outside the database.')
# Manager
@ -537,13 +487,13 @@ class Build(models.Model):
CONFIG_KEY = '__config'
class Meta:
class Meta(object):
ordering = ['-date']
get_latest_by = 'date'
index_together = [['version', 'state', 'type']]
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
super(Build, self).__init__(*args, **kwargs)
self._config_changed = False
@property
@ -556,11 +506,14 @@ class Build(models.Model):
date = self.date or timezone.now()
if self.project is not None and self.version is not None:
return (
Build.objects.filter(
Build.objects
.filter(
project=self.project,
version=self.version,
date__lt=date,
).order_by('-date').first()
)
.order_by('-date')
.first()
)
return None
@ -570,9 +523,9 @@ class Build(models.Model):
Get the config used for this build.
Since we are saving the config into the JSON field only when it differs
from the previous one, this helper returns the correct JSON used in this
Build object (it could be stored in this object or one of the previous
ones).
from the previous one, this helper returns the correct JSON used in
this Build object (it could be stored in this object or one of the
previous ones).
"""
if self.CONFIG_KEY in self._config:
return Build.objects.get(pk=self._config[self.CONFIG_KEY])._config
@ -600,11 +553,11 @@ class Build(models.Model):
"""
if self.pk is None or self._config_changed:
previous = self.previous
if (previous is not None and self._config and
self._config == previous.config):
if (previous is not None and
self._config and self._config == previous.config):
previous_pk = previous._config.get(self.CONFIG_KEY, previous.pk)
self._config = {self.CONFIG_KEY: previous_pk}
super().save(*args, **kwargs)
super(Build, self).save(*args, **kwargs)
self._config_changed = False
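The de-duplication scheme in ``save()`` and the ``config`` property can be illustrated with a standalone sketch (not the actual Django model): a build stores its full config only when it differs from the previous build, otherwise it stores a pointer to the build that owns the canonical copy.

```python
CONFIG_KEY = '__config'

class FakeBuild:
    """Minimal stand-in for Build; _store simulates the table, in date order."""

    _store = []

    def __init__(self, config):
        self.pk = len(FakeBuild._store)
        previous = FakeBuild._store[-1] if FakeBuild._store else None
        if previous is not None and config == previous.config:
            # Same config as before: store a pointer to the original
            # owner, following an existing pointer if there is one.
            owner_pk = previous._config.get(CONFIG_KEY, previous.pk)
            self._config = {CONFIG_KEY: owner_pk}
        else:
            self._config = config
        FakeBuild._store.append(self)

    @property
    def config(self):
        # Resolve the pointer back to the build holding the real config.
        if CONFIG_KEY in self._config:
            return FakeBuild._store[self._config[CONFIG_KEY]]._config
        return self._config
```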
def __str__(self):
@ -615,8 +568,7 @@ class Build(models.Model):
self.project.users.all().values_list('username', flat=True),
),
pk=self.pk,
),
)
))
def get_absolute_url(self):
return reverse('builds_detail', args=[self.project.slug, self.pk])
@ -627,7 +579,7 @@ class Build(models.Model):
return self.state == BUILD_STATE_FINISHED
class BuildCommandResultMixin:
class BuildCommandResultMixin(object):
"""
Mixin for common command result methods/properties.
@ -657,10 +609,7 @@ class BuildCommandResult(BuildCommandResultMixin, models.Model):
"""Build command for a ``Build``."""
build = models.ForeignKey(
Build,
verbose_name=_('Build'),
related_name='commands',
)
Build, verbose_name=_('Build'), related_name='commands')
command = models.TextField(_('Command'))
description = models.TextField(_('Description'), blank=True)
@ -670,7 +619,7 @@ class BuildCommandResult(BuildCommandResultMixin, models.Model):
start_time = models.DateTimeField(_('Start time'))
end_time = models.DateTimeField(_('End time'))
class Meta:
class Meta(object):
ordering = ['start_time']
get_latest_by = 'start_time'
@ -679,8 +628,7 @@ class BuildCommandResult(BuildCommandResultMixin, models.Model):
def __str__(self):
return (
ugettext('Build command {pk} for build {build}')
.format(pk=self.pk, build=self.build)
)
.format(pk=self.pk, build=self.build))
@property
def run_time(self):


@ -3,10 +3,13 @@
# pylint: disable=too-many-lines
"""Build configuration for rtd."""
from __future__ import division, print_function, unicode_literals
import os
import re
from contextlib import contextmanager
import six
from readthedocs.projects.constants import DOCUMENTATION_CHOICES
from .find import find_one
@ -18,13 +21,11 @@ from .validation import (
validate_bool,
validate_choice,
validate_dict,
validate_directory,
validate_file,
validate_list,
validate_string,
)
__all__ = (
'ALL',
'load',
@ -40,12 +41,8 @@ CONFIG_FILENAME_REGEX = r'^\.?readthedocs.ya?ml$'
CONFIG_NOT_SUPPORTED = 'config-not-supported'
VERSION_INVALID = 'version-invalid'
BASE_INVALID = 'base-invalid'
BASE_NOT_A_DIR = 'base-not-a-directory'
CONFIG_SYNTAX_INVALID = 'config-syntax-invalid'
CONFIG_REQUIRED = 'config-required'
NAME_REQUIRED = 'name-required'
NAME_INVALID = 'name-invalid'
CONF_FILE_REQUIRED = 'conf-file-required'
PYTHON_INVALID = 'python-invalid'
SUBMODULES_INVALID = 'submodules-invalid'
@ -82,7 +79,7 @@ class ConfigError(Exception):
def __init__(self, message, code):
self.code = code
super().__init__(message)
super(ConfigError, self).__init__(message)
class ConfigOptionNotSupportedError(ConfigError):
@ -94,9 +91,9 @@ class ConfigOptionNotSupportedError(ConfigError):
template = (
'The "{}" configuration option is not supported in this version'
)
super().__init__(
super(ConfigOptionNotSupportedError, self).__init__(
template.format(self.configuration),
CONFIG_NOT_SUPPORTED,
CONFIG_NOT_SUPPORTED
)
@ -115,10 +112,10 @@ class InvalidConfig(ConfigError):
code=code,
error=error_message,
)
super().__init__(message, code=code)
super(InvalidConfig, self).__init__(message, code=code)
class BuildConfigBase:
class BuildConfigBase(object):
"""
Config that handles the build of one particular documentation.
@ -137,15 +134,9 @@ class BuildConfigBase:
"""
PUBLIC_ATTRIBUTES = [
'version',
'formats',
'python',
'conda',
'build',
'doctype',
'sphinx',
'mkdocs',
'submodules',
'version', 'formats', 'python',
'conda', 'build', 'doctype',
'sphinx', 'mkdocs', 'submodules',
]
version = None
@ -232,7 +223,7 @@ class BuildConfigBase:
@property
def python_interpreter(self):
ver = self.python_full_version
return 'python{}'.format(ver)
return 'python{0}'.format(ver)
@property
def python_full_version(self):
@ -241,7 +232,9 @@ class BuildConfigBase:
# Get the highest version of the major series version if user only
# gave us a version of '2', or '3'
ver = max(
v for v in self.get_valid_python_versions() if v < ver + 1
v
for v in self.get_valid_python_versions()
if v < ver + 1
)
return ver
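The major-series resolution above can be sketched in isolation. The list of supported versions here is an assumption for illustration, not the values Read the Docs actually ships:

```python
# Assumed list of valid Python versions, majors first (2 means "any 2.x").
VALID_PYTHON_VERSIONS = [2, 2.7, 3, 3.5, 3.6, 3.7]

def full_version(ver):
    """Resolve a bare major version (2 or 3) to its highest full version."""
    if ver in (2, 3):
        # Highest entry still below the next major series.
        return max(v for v in VALID_PYTHON_VERSIONS if v < ver + 1)
    return ver
```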
@ -264,12 +257,6 @@ class BuildConfigV1(BuildConfigBase):
"""Version 1 of the configuration file."""
BASE_INVALID_MESSAGE = 'Invalid value for base: {base}'
BASE_NOT_A_DIR_MESSAGE = '"base" is not a directory: {base}'
NAME_REQUIRED_MESSAGE = 'Missing key "name"'
NAME_INVALID_MESSAGE = (
'Invalid name "{name}". Valid values must match {name_re}'
)
CONF_FILE_REQUIRED_MESSAGE = 'Missing key "conf_file"'
PYTHON_INVALID_MESSAGE = '"python" section must be a mapping.'
PYTHON_EXTRA_REQUIREMENTS_INVALID_MESSAGE = (
@ -307,66 +294,17 @@ class BuildConfigV1(BuildConfigBase):
``readthedocs.yml`` config file if not set
"""
# Validate env_config.
# TODO: this isn't used
self._config['output_base'] = self.validate_output_base()
# Validate the build environment first
# Must happen before `validate_python`!
self._config['build'] = self.validate_build()
# Validate raw_config. Order matters.
# TODO: this isn't used
self._config['name'] = self.validate_name()
# TODO: this isn't used
self._config['base'] = self.validate_base()
self._config['python'] = self.validate_python()
self._config['formats'] = self.validate_formats()
self._config['conda'] = self.validate_conda()
self._config['requirements_file'] = self.validate_requirements_file()
def validate_output_base(self):
"""Validates that ``output_base`` exists and set its absolute path."""
assert 'output_base' in self.env_config, (
'"output_base" required in "env_config"'
)
output_base = os.path.abspath(
os.path.join(
self.env_config.get('output_base', self.base_path),
),
)
return output_base
def validate_name(self):
"""Validates that name exists."""
name = self.raw_config.get('name', None)
if not name:
name = self.env_config.get('name', None)
if not name:
self.error('name', self.NAME_REQUIRED_MESSAGE, code=NAME_REQUIRED)
name_re = r'^[-_.0-9a-zA-Z]+$'
if not re.match(name_re, name):
self.error(
'name',
self.NAME_INVALID_MESSAGE.format(
name=name,
name_re=name_re,
),
code=NAME_INVALID,
)
return name
def validate_base(self):
"""Validates that path is a valid directory."""
if 'base' in self.raw_config:
base = self.raw_config['base']
else:
base = self.base_path
with self.catch_validation_error('base'):
base = validate_directory(base, self.base_path)
return base
def validate_build(self):
"""
Validate the build config settings.
@ -402,16 +340,12 @@ class BuildConfigV1(BuildConfigBase):
# Prepend proper image name to user's image name
build['image'] = '{}:{}'.format(
DOCKER_DEFAULT_IMAGE,
build['image'],
build['image']
)
# Update docker default settings from image name
if build['image'] in DOCKER_IMAGE_SETTINGS:
self.env_config.update(DOCKER_IMAGE_SETTINGS[build['image']],)
# Update docker settings from user config
if 'DOCKER_IMAGE_SETTINGS' in self.env_config and \
build['image'] in self.env_config['DOCKER_IMAGE_SETTINGS']:
self.env_config.update(
self.env_config['DOCKER_IMAGE_SETTINGS'][build['image']],
DOCKER_IMAGE_SETTINGS[build['image']]
)
# Allow to override specific project
@ -439,22 +373,20 @@ class BuildConfigV1(BuildConfigBase):
self.error(
'python',
self.PYTHON_INVALID_MESSAGE,
code=PYTHON_INVALID,
)
code=PYTHON_INVALID)
# Validate use_system_site_packages.
if 'use_system_site_packages' in raw_python:
with self.catch_validation_error('python.use_system_site_packages',):
with self.catch_validation_error(
'python.use_system_site_packages'):
python['use_system_site_packages'] = validate_bool(
raw_python['use_system_site_packages'],
)
raw_python['use_system_site_packages'])
# Validate pip_install.
if 'pip_install' in raw_python:
with self.catch_validation_error('python.pip_install'):
python['install_with_pip'] = validate_bool(
raw_python['pip_install'],
)
raw_python['pip_install'])
# Validate extra_requirements.
if 'extra_requirements' in raw_python:
@ -463,30 +395,29 @@ class BuildConfigV1(BuildConfigBase):
self.error(
'python.extra_requirements',
self.PYTHON_EXTRA_REQUIREMENTS_INVALID_MESSAGE,
code=PYTHON_INVALID,
)
code=PYTHON_INVALID)
if not python['install_with_pip']:
python['extra_requirements'] = []
else:
for extra_name in raw_extra_requirements:
with self.catch_validation_error('python.extra_requirements',):
with self.catch_validation_error(
'python.extra_requirements'):
python['extra_requirements'].append(
validate_string(extra_name),
validate_string(extra_name)
)
# Validate setup_py_install.
if 'setup_py_install' in raw_python:
with self.catch_validation_error('python.setup_py_install'):
python['install_with_setup'] = validate_bool(
raw_python['setup_py_install'],
)
raw_python['setup_py_install'])
if 'version' in raw_python:
with self.catch_validation_error('python.version'):
# Try to convert strings to an int first, to catch '2', then
# a float, to catch '2.7'
version = raw_python['version']
if isinstance(version, str):
if isinstance(version, six.string_types):
try:
version = int(version)
except ValueError:
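The string coercion described in the comment above ('2' to an int, '2.7' to a float) can be sketched as a standalone helper; the name ``coerce_python_version`` is ours, not the codebase's:

```python
def coerce_python_version(version):
    """Coerce '2' to int 2 and '2.7' to float 2.7; leave the rest alone."""
    if isinstance(version, str):
        try:
            version = int(version)
        except ValueError:
            try:
                version = float(version)
            except ValueError:
                # Not a number; later validation will reject it.
                pass
    return version
```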
@ -513,8 +444,7 @@ class BuildConfigV1(BuildConfigBase):
if 'file' in raw_conda:
with self.catch_validation_error('conda.file'):
conda_environment = validate_file(
raw_conda['file'],
self.base_path,
raw_conda['file'], self.base_path
)
conda['environment'] = conda_environment
@ -548,21 +478,6 @@ class BuildConfigV1(BuildConfigBase):
return formats
@property
def name(self):
"""The project name."""
return self._config['name']
@property
def base(self):
"""The base directory."""
return self._config['base']
@property
def output_base(self):
"""The output base."""
return self._config['output_base']
@property
def formats(self):
"""The documentation formats to be built."""
@ -735,7 +650,7 @@ class BuildConfigV2(BuildConfigBase):
python = {}
with self.catch_validation_error('python.version'):
version = self.pop_config('python.version', 3)
if isinstance(version, str):
if isinstance(version, six.string_types):
try:
version = int(version)
except ValueError:
@ -767,8 +682,7 @@ class BuildConfigV2(BuildConfigBase):
with self.catch_validation_error('python.extra_requirements'):
extra_requirements = self.pop_config(
'python.extra_requirements',
[],
'python.extra_requirements', []
)
extra_requirements = validate_list(extra_requirements)
if extra_requirements and not python['install_with_pip']:
@ -886,8 +800,7 @@ class BuildConfigV2(BuildConfigBase):
if not configuration:
configuration = None
configuration = self.pop_config(
'sphinx.configuration',
configuration,
'sphinx.configuration', configuration
)
if configuration is not None:
configuration = validate_file(configuration, self.base_path)
@ -903,8 +816,9 @@ class BuildConfigV2(BuildConfigBase):
"""
Validates that the doctype is the same as the admin panel.
This a temporal validation, as the configuration file should support per
version doctype, but we need to adapt the rtd code for that.
This is a temporary validation, as the configuration file
should support a per-version doctype, but we need to
adapt the rtd code for that.
"""
dashboard_doctype = self.defaults.get('doctype', 'sphinx')
if self.doctype != dashboard_doctype:
@ -914,7 +828,7 @@ class BuildConfigV2(BuildConfigBase):
if dashboard_doctype == 'mkdocs' or not self.sphinx:
error_msg += ' but there is no "{}" key specified.'.format(
'mkdocs' if dashboard_doctype == 'mkdocs' else 'sphinx',
'mkdocs' if dashboard_doctype == 'mkdocs' else 'sphinx'
)
else:
error_msg += ' but your "sphinx.builder" key does not match.'
@ -976,8 +890,8 @@ class BuildConfigV2(BuildConfigBase):
"""
Checks that we don't have extra keys (invalid ones).
This should be called after all the validations are done and all keys
are popped from `self.raw_config`.
This should be called after all the validations are done
and all keys are popped from `self.raw_config`.
"""
msg = (
'Invalid configuration option: {}. '
@ -1069,7 +983,7 @@ def load(path, env_config):
if not filename:
raise ConfigError(
'No configuration file found',
code=CONFIG_REQUIRED,
code=CONFIG_REQUIRED
)
with open(filename, 'r') as configuration_file:
try:


@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
from __future__ import division, print_function, unicode_literals
import os
import re
import textwrap
@ -22,8 +24,6 @@ from readthedocs.config.config import (
CONFIG_NOT_SUPPORTED,
CONFIG_REQUIRED,
INVALID_KEY,
NAME_INVALID,
NAME_REQUIRED,
PYTHON_INVALID,
VERSION_INVALID,
)
@ -32,49 +32,19 @@ from readthedocs.config.validation import (
INVALID_BOOL,
INVALID_CHOICE,
INVALID_LIST,
INVALID_PATH,
INVALID_STRING,
VALUE_NOT_FOUND,
ValidationError,
)
from .utils import apply_fs
env_config = {
'output_base': '/tmp',
}
minimal_config = {
'name': 'docs',
}
config_with_explicit_empty_list = {
'readthedocs.yml': '''
name: docs
formats: []
''',
}
minimal_config_dir = {
'readthedocs.yml': '''\
name: docs
''',
}
multiple_config_dir = {
'readthedocs.yml': '''
name: first
---
name: second
''',
'nested': minimal_config_dir,
}
yaml_extension_config_dir = {
'readthedocs.yaml': '''\
name: docs
type: sphinx
'''
yaml_config_dir = {
'readthedocs.yml': textwrap.dedent(
'''
formats:
- pdf
'''
),
}
@ -86,18 +56,6 @@ def get_build_config(config, env_config=None, source_file='readthedocs.yml'):
)
def get_env_config(extra=None):
"""Get the minimal env_config for the configuration object."""
defaults = {
'output_base': '',
'name': 'name',
}
if extra is None:
extra = {}
defaults.update(extra)
return defaults
@pytest.mark.parametrize('files', [
{'readthedocs.ymlmore': ''}, {'first': {'readthedocs.yml': ''}},
{'startreadthedocs.yml': ''}, {'second': {'confuser.txt': 'content'}},
@ -109,7 +67,7 @@ def test_load_no_config_file(tmpdir, files):
apply_fs(tmpdir, files)
base = str(tmpdir)
with raises(ConfigError) as e:
load(base, env_config)
load(base, {})
assert e.value.code == CONFIG_REQUIRED
@ -119,13 +77,13 @@ def test_load_empty_config_file(tmpdir):
})
base = str(tmpdir)
with raises(ConfigError):
load(base, env_config)
load(base, {})
def test_minimal_config(tmpdir):
apply_fs(tmpdir, minimal_config_dir)
apply_fs(tmpdir, yaml_config_dir)
base = str(tmpdir)
build = load(base, env_config)
build = load(base, {})
assert isinstance(build, BuildConfigV1)
@ -136,7 +94,7 @@ def test_load_version1(tmpdir):
''')
})
base = str(tmpdir)
build = load(base, get_env_config({'allow_v2': True}))
build = load(base, {'allow_v2': True})
assert isinstance(build, BuildConfigV1)
@ -147,7 +105,7 @@ def test_load_version2(tmpdir):
''')
})
base = str(tmpdir)
build = load(base, get_env_config({'allow_v2': True}))
build = load(base, {'allow_v2': True})
assert isinstance(build, BuildConfigV2)
@ -159,83 +117,70 @@ def test_load_unknow_version(tmpdir):
})
base = str(tmpdir)
with raises(ConfigError) as excinfo:
load(base, get_env_config({'allow_v2': True}))
load(base, {'allow_v2': True})
assert excinfo.value.code == VERSION_INVALID
def test_yaml_extension(tmpdir):
"""Make sure it's capable of loading the 'readthedocs' file with a 'yaml' extension."""
apply_fs(tmpdir, yaml_extension_config_dir)
apply_fs(tmpdir, {
'readthedocs.yaml': textwrap.dedent(
'''
python:
version: 3
'''
),
})
base = str(tmpdir)
config = load(base, env_config)
config = load(base, {})
assert isinstance(config, BuildConfigV1)
def test_build_config_has_source_file(tmpdir):
base = str(apply_fs(tmpdir, minimal_config_dir))
build = load(base, env_config)
base = str(apply_fs(tmpdir, yaml_config_dir))
build = load(base, {})
assert build.source_file == os.path.join(base, 'readthedocs.yml')
def test_build_config_has_list_with_single_empty_value(tmpdir):
base = str(apply_fs(tmpdir, config_with_explicit_empty_list))
build = load(base, env_config)
base = str(apply_fs(tmpdir, {
'readthedocs.yml': textwrap.dedent(
'''
formats: []
'''
)
}))
build = load(base, {})
assert isinstance(build, BuildConfigV1)
assert build.formats == []
def test_config_requires_name():
build = BuildConfigV1(
{'output_base': ''},
{},
source_file='readthedocs.yml',
)
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'name'
assert excinfo.value.code == NAME_REQUIRED
def test_build_requires_valid_name():
build = BuildConfigV1(
{'output_base': ''},
{'name': 'with/slashes'},
source_file='readthedocs.yml',
)
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'name'
assert excinfo.value.code == NAME_INVALID
def test_version():
build = get_build_config({}, get_env_config())
build = get_build_config({})
assert build.version == '1'
def test_doc_type():
build = get_build_config(
{},
get_env_config(
{
'defaults': {
'doctype': 'sphinx',
},
}
)
{
'defaults': {
'doctype': 'sphinx',
},
}
)
build.validate()
assert build.doctype == 'sphinx'
def test_empty_python_section_is_valid():
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
assert build.python
def test_python_section_must_be_dict():
build = get_build_config({'python': 123}, get_env_config())
build = get_build_config({'python': 123})
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'python'
@ -243,7 +188,7 @@ def test_python_section_must_be_dict():
def test_use_system_site_packages_defaults_to_false():
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
# Default is False.
assert not build.python.use_system_site_packages
@ -254,22 +199,22 @@ def test_use_system_site_packages_repects_default_value(value):
defaults = {
'use_system_packages': value,
}
build = get_build_config({}, get_env_config({'defaults': defaults}))
build = get_build_config({}, {'defaults': defaults})
build.validate()
assert build.python.use_system_site_packages is value
def test_python_pip_install_default():
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
# Default is False.
assert build.python.install_with_pip is False
class TestValidatePythonExtraRequirements:
class TestValidatePythonExtraRequirements(object):
def test_it_defaults_to_list(self):
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
# Default is an empty list.
assert build.python.extra_requirements == []
@ -277,7 +222,6 @@ class TestValidatePythonExtraRequirements:
def test_it_validates_is_a_list(self):
build = get_build_config(
{'python': {'extra_requirements': 'invalid'}},
get_env_config(),
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -294,23 +238,21 @@ class TestValidatePythonExtraRequirements:
'extra_requirements': ['tests'],
},
},
get_env_config(),
)
build.validate()
validate_string.assert_any_call('tests')
class TestValidateUseSystemSitePackages:
class TestValidateUseSystemSitePackages(object):
def test_it_defaults_to_false(self):
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
assert build.python.use_system_site_packages is False
def test_it_validates_value(self):
build = get_build_config(
{'python': {'use_system_site_packages': 'invalid'}},
get_env_config(),
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -322,23 +264,21 @@ class TestValidateUseSystemSitePackages:
validate_bool.return_value = True
build = get_build_config(
{'python': {'use_system_site_packages': 'to-validate'}},
get_env_config(),
)
build.validate()
validate_bool.assert_any_call('to-validate')
class TestValidateSetupPyInstall:
class TestValidateSetupPyInstall(object):
def test_it_defaults_to_false(self):
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
assert build.python.install_with_setup is False
def test_it_validates_value(self):
build = get_build_config(
{'python': {'setup_py_install': 'this-is-string'}},
get_env_config(),
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -350,16 +290,15 @@ class TestValidateSetupPyInstall:
validate_bool.return_value = True
build = get_build_config(
{'python': {'setup_py_install': 'to-validate'}},
get_env_config(),
)
build.validate()
validate_bool.assert_any_call('to-validate')
class TestValidatePythonVersion:
class TestValidatePythonVersion(object):
def test_it_defaults_to_a_valid_version(self):
build = get_build_config({'python': {}}, get_env_config())
build = get_build_config({'python': {}})
build.validate()
assert build.python.version == 2
assert build.python_interpreter == 'python2.7'
@ -368,7 +307,6 @@ class TestValidatePythonVersion:
def test_it_supports_other_versions(self):
build = get_build_config(
{'python': {'version': 3.5}},
get_env_config(),
)
build.validate()
assert build.python.version == 3.5
@ -378,7 +316,6 @@ class TestValidatePythonVersion:
def test_it_validates_versions_out_of_range(self):
build = get_build_config(
{'python': {'version': 1.0}},
get_env_config(),
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -388,7 +325,6 @@ class TestValidatePythonVersion:
def test_it_validates_wrong_type(self):
build = get_build_config(
{'python': {'version': 'this-is-string'}},
get_env_config(),
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -398,7 +334,6 @@ class TestValidatePythonVersion:
def test_it_validates_wrong_type_right_value(self):
build = get_build_config(
{'python': {'version': '3.5'}},
get_env_config(),
)
build.validate()
assert build.python.version == 3.5
@ -407,7 +342,6 @@ class TestValidatePythonVersion:
build = get_build_config(
{'python': {'version': '3'}},
get_env_config(),
)
build.validate()
assert build.python.version == 3
@ -417,12 +351,10 @@ class TestValidatePythonVersion:
def test_it_validates_env_supported_versions(self):
build = get_build_config(
{'python': {'version': 3.6}},
env_config=get_env_config(
{
'python': {'supported_versions': [3.5]},
'build': {'image': 'custom'},
}
)
env_config={
'python': {'supported_versions': [3.5]},
'build': {'image': 'custom'},
},
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -431,12 +363,10 @@ class TestValidatePythonVersion:
build = get_build_config(
{'python': {'version': 3.6}},
env_config=get_env_config(
{
'python': {'supported_versions': [3.5, 3.6]},
'build': {'image': 'custom'},
}
)
env_config={
'python': {'supported_versions': [3.5, 3.6]},
'build': {'image': 'custom'},
},
)
build.validate()
assert build.python.version == 3.6
@ -450,43 +380,42 @@ class TestValidatePythonVersion:
}
build = get_build_config(
{},
get_env_config({'defaults': defaults}),
{'defaults': defaults},
)
build.validate()
assert build.python.version == value
class TestValidateFormats:
class TestValidateFormats(object):
def test_it_defaults_to_empty(self):
build = get_build_config({}, get_env_config())
build = get_build_config({})
build.validate()
assert build.formats == []
def test_it_gets_set_correctly(self):
build = get_build_config({'formats': ['pdf']}, get_env_config())
build = get_build_config({'formats': ['pdf']})
build.validate()
assert build.formats == ['pdf']
def test_formats_can_be_null(self):
build = get_build_config({'formats': None}, get_env_config())
build = get_build_config({'formats': None})
build.validate()
assert build.formats == []
def test_formats_with_previous_none(self):
build = get_build_config({'formats': ['none']}, get_env_config())
build = get_build_config({'formats': ['none']})
build.validate()
assert build.formats == []
def test_formats_can_be_empty(self):
build = get_build_config({'formats': []}, get_env_config())
build = get_build_config({'formats': []})
build.validate()
assert build.formats == []
def test_all_valid_formats(self):
build = get_build_config(
{'formats': ['pdf', 'htmlzip', 'epub']},
get_env_config()
)
build.validate()
assert build.formats == ['pdf', 'htmlzip', 'epub']
@ -494,7 +423,6 @@ class TestValidateFormats:
def test_cant_have_none_as_format(self):
build = get_build_config(
{'formats': ['htmlzip', None]},
get_env_config()
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -504,7 +432,6 @@ class TestValidateFormats:
def test_formats_have_only_allowed_values(self):
build = get_build_config(
{'formats': ['htmlzip', 'csv']},
get_env_config()
)
with raises(InvalidConfig) as excinfo:
build.validate()
@ -512,7 +439,7 @@ class TestValidateFormats:
assert excinfo.value.code == INVALID_CHOICE
def test_only_list_type(self):
build = get_build_config({'formats': 'no-list'}, get_env_config())
build = get_build_config({'formats': 'no-list'})
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'format'
@ -521,75 +448,23 @@ class TestValidateFormats:
def test_valid_build_config():
build = BuildConfigV1(
env_config,
minimal_config,
{},
{},
source_file='readthedocs.yml',
)
build.validate()
assert build.name == 'docs'
assert build.base
assert build.python
assert build.python.install_with_setup is False
assert build.python.install_with_pip is False
assert build.python.use_system_site_packages is False
assert build.output_base
class TestValidateBase:
def test_it_validates_to_abspath(self, tmpdir):
apply_fs(tmpdir, {'configs': minimal_config, 'docs': {}})
with tmpdir.as_cwd():
source_file = str(tmpdir.join('configs', 'readthedocs.yml'))
build = BuildConfigV1(
get_env_config(),
{'base': '../docs'},
source_file=source_file,
)
build.validate()
assert build.base == str(tmpdir.join('docs'))
@patch('readthedocs.config.config.validate_directory')
def test_it_uses_validate_directory(self, validate_directory):
validate_directory.return_value = 'path'
build = get_build_config({'base': '../my-path'}, get_env_config())
build.validate()
# Test for first argument to validate_directory
args, kwargs = validate_directory.call_args
assert args[0] == '../my-path'
def test_it_fails_if_base_is_not_a_string(self, tmpdir):
apply_fs(tmpdir, minimal_config)
with tmpdir.as_cwd():
build = BuildConfigV1(
get_env_config(),
{'base': 1},
source_file=str(tmpdir.join('readthedocs.yml')),
)
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'base'
assert excinfo.value.code == INVALID_STRING
def test_it_fails_if_base_does_not_exist(self, tmpdir):
apply_fs(tmpdir, minimal_config)
build = BuildConfigV1(
get_env_config(),
{'base': 'docs'},
source_file=str(tmpdir.join('readthedocs.yml')),
)
with raises(InvalidConfig) as excinfo:
build.validate()
assert excinfo.value.key == 'base'
assert excinfo.value.code == INVALID_PATH
class TestValidateBuild:
class TestValidateBuild(object):
def test_it_fails_if_build_is_invalid_option(self, tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
build = BuildConfigV1(
get_env_config(),
{},
{'build': {'image': 3.0}},
source_file=str(tmpdir.join('readthedocs.yml')),
)
@ -599,7 +474,7 @@ class TestValidateBuild:
assert excinfo.value.code == INVALID_CHOICE
def test_it_fails_on_python_validation(self, tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
build = BuildConfigV1(
{},
{
@ -615,7 +490,7 @@ class TestValidateBuild:
assert excinfo.value.code == INVALID_CHOICE
def test_it_works_on_python_validation(self, tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
build = BuildConfigV1(
{},
{
@ -628,9 +503,9 @@ class TestValidateBuild:
build.validate_python()
def test_it_works(self, tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
build = BuildConfigV1(
get_env_config(),
{},
{'build': {'image': 'latest'}},
source_file=str(tmpdir.join('readthedocs.yml')),
)
@ -638,9 +513,9 @@ class TestValidateBuild:
assert build.build.image == 'readthedocs/build:latest'
def test_default(self, tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
build = BuildConfigV1(
get_env_config(),
{},
{},
source_file=str(tmpdir.join('readthedocs.yml')),
)
@ -650,12 +525,12 @@ class TestValidateBuild:
@pytest.mark.parametrize(
'image', ['latest', 'readthedocs/build:3.0', 'rtd/build:latest'])
def test_it_priorities_image_from_env_config(self, tmpdir, image):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, yaml_config_dir)
defaults = {
'build_image': image,
}
build = BuildConfigV1(
get_env_config({'defaults': defaults}),
{'defaults': defaults},
{'build': {'image': 'latest'}},
source_file=str(tmpdir.join('readthedocs.yml')),
)
@ -664,7 +539,7 @@ class TestValidateBuild:
def test_use_conda_default_false():
build = get_build_config({}, get_env_config())
build = get_build_config({})
build.validate()
assert build.conda is None
@ -672,7 +547,6 @@ def test_use_conda_default_false():
def test_use_conda_respects_config():
build = get_build_config(
{'conda': {}},
get_env_config(),
)
build.validate()
assert isinstance(build.conda, Conda)
@ -682,7 +556,6 @@ def test_validates_conda_file(tmpdir):
apply_fs(tmpdir, {'environment.yml': ''})
build = get_build_config(
{'conda': {'file': 'environment.yml'}},
get_env_config(),
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -691,7 +564,7 @@ def test_validates_conda_file(tmpdir):
def test_requirements_file_empty():
build = get_build_config({}, get_env_config())
build = get_build_config({})
build.validate()
assert build.python.requirements is None
@ -703,7 +576,7 @@ def test_requirements_file_repects_default_value(tmpdir):
}
build = get_build_config(
{},
get_env_config({'defaults': defaults}),
{'defaults': defaults},
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -714,7 +587,6 @@ def test_requirements_file_respects_configuration(tmpdir):
apply_fs(tmpdir, {'requirements.txt': ''})
build = get_build_config(
{'requirements_file': 'requirements.txt'},
get_env_config(),
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -724,7 +596,6 @@ def test_requirements_file_respects_configuration(tmpdir):
def test_requirements_file_is_null(tmpdir):
build = get_build_config(
{'requirements_file': None},
get_env_config(),
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -734,7 +605,6 @@ def test_requirements_file_is_null(tmpdir):
def test_requirements_file_is_blank(tmpdir):
build = get_build_config(
{'requirements_file': ''},
get_env_config(),
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -742,7 +612,7 @@ def test_requirements_file_is_blank(tmpdir):
def test_build_validate_calls_all_subvalidators(tmpdir):
apply_fs(tmpdir, minimal_config)
apply_fs(tmpdir, {})
build = BuildConfigV1(
{},
{},
@ -750,28 +620,22 @@ def test_build_validate_calls_all_subvalidators(tmpdir):
)
with patch.multiple(
BuildConfigV1,
validate_base=DEFAULT,
validate_name=DEFAULT,
validate_python=DEFAULT,
validate_output_base=DEFAULT,
):
build.validate()
BuildConfigV1.validate_base.assert_called_with()
BuildConfigV1.validate_name.assert_called_with()
BuildConfigV1.validate_python.assert_called_with()
BuildConfigV1.validate_output_base.assert_called_with()
def test_load_calls_validate(tmpdir):
apply_fs(tmpdir, minimal_config_dir)
apply_fs(tmpdir, yaml_config_dir)
base = str(tmpdir)
with patch.object(BuildConfigV1, 'validate') as build_validate:
load(base, env_config)
load(base, {})
assert build_validate.call_count == 1
def test_raise_config_not_supported():
build = get_build_config({}, get_env_config())
build = get_build_config({})
build.validate()
with raises(ConfigOptionNotSupportedError) as excinfo:
build.redirects
@ -797,12 +661,12 @@ def test_as_dict(tmpdir):
},
'requirements_file': 'requirements.txt',
},
get_env_config({
{
'defaults': {
'doctype': 'sphinx',
'sphinx_configuration': None,
},
}),
},
source_file=str(tmpdir.join('readthedocs.yml')),
)
build.validate()
@ -840,7 +704,7 @@ def test_as_dict(tmpdir):
assert build.as_dict() == expected_dict
class TestBuildConfigV2:
class TestBuildConfigV2(object):
def get_build_config(
self, config, env_config=None, source_file='readthedocs.yml'):


@ -1,57 +1,52 @@
# -*- coding: utf-8 -*-
"""URL configurations for subdomains."""
from __future__ import absolute_import
from functools import reduce
from operator import add
from django.conf import settings
from django.conf.urls import url
from django.conf import settings
from django.conf.urls.static import static
from readthedocs.constants import pattern_opts
from readthedocs.core.views import server_error_404, server_error_500
from readthedocs.core.views.serve import (
redirect_page_with_filename,
redirect_project_slug,
serve_docs,
redirect_project_slug, serve_docs, robots_txt,
)
from readthedocs.core.views import (
server_error_500,
server_error_404,
)
from readthedocs.constants import pattern_opts
handler500 = server_error_500
handler404 = server_error_404
subdomain_urls = [
url(
r'^(?:|projects/(?P<subproject_slug>{project_slug})/)'
url(r'robots.txt$', robots_txt, name='robots_txt'),
url(r'^(?:|projects/(?P<subproject_slug>{project_slug})/)'
r'page/(?P<filename>.*)$'.format(**pattern_opts),
redirect_page_with_filename,
name='docs_detail',
),
url(
(r'^(?:|projects/(?P<subproject_slug>{project_slug})/)$').format(
**pattern_opts
),
name='docs_detail'),
url((r'^(?:|projects/(?P<subproject_slug>{project_slug})/)$').format(**pattern_opts),
redirect_project_slug,
name='redirect_project_slug',
),
url(
(
r'^(?:|projects/(?P<subproject_slug>{project_slug})/)'
r'(?P<lang_slug>{lang_slug})/'
r'(?P<version_slug>{version_slug})/'
r'(?P<filename>{filename_slug})$'.format(**pattern_opts)
),
name='redirect_project_slug'),
url((r'^(?:|projects/(?P<subproject_slug>{project_slug})/)'
r'(?P<lang_slug>{lang_slug})/'
r'(?P<version_slug>{version_slug})/'
r'(?P<filename>{filename_slug})$'.format(**pattern_opts)),
serve_docs,
name='docs_detail',
),
name='docs_detail'),
]
groups = [subdomain_urls]
# Needed to serve media locally
if getattr(settings, 'DEBUG', False):
groups.insert(
0, static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
)
groups.insert(0, static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT))
urlpatterns = reduce(add, groups)
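The `reduce(add, groups)` idiom above concatenates the list of URL-pattern groups into one flat `urlpatterns` list, preserving order. With plain lists standing in for pattern lists, it behaves like this (stdlib-only sketch):

```python
from functools import reduce
from operator import add

# Each group is a list of URL patterns; flatten them in order.
groups = [['a', 'b'], ['c'], ['d', 'e']]
urlpatterns = reduce(add, groups)
print(urlpatterns)  # ['a', 'b', 'c', 'd', 'e']
```

Inserting the static-files group at index 0, as the `DEBUG` branch does, makes its patterns match first.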


@ -19,7 +19,7 @@ from django.views.generic import TemplateView
from readthedocs.builds.models import Version
from readthedocs.core.utils import broadcast
from readthedocs.projects.models import Project, ImportedFile
from readthedocs.projects.tasks import remove_dir
from readthedocs.projects.tasks import remove_dirs
from readthedocs.redirects.utils import get_redirect_response
log = logging.getLogger(__name__)
@ -89,7 +89,7 @@ def wipe_version(request, project_slug, version_slug):
os.path.join(version.project.doc_path, 'conda', version.slug),
]
for del_dir in del_dirs:
broadcast(type='build', task=remove_dir, args=[del_dir])
broadcast(type='build', task=remove_dirs, args=[(del_dir,)])
return redirect('project_version_list', project_slug)
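The `remove_dir` → `remove_dirs` change above passes a tuple of paths in `args`, so a single task invocation can delete several directories. A minimal sketch of such a task (the name and Celery wiring are illustrative, not the project's actual task body):

```python
import os
import shutil
import tempfile

def remove_dirs(paths):
    # Delete every directory in ``paths``; missing ones are ignored.
    for path in paths:
        shutil.rmtree(path, ignore_errors=True)

base = tempfile.mkdtemp()
target = os.path.join(base, 'conda')
os.makedirs(target)
remove_dirs((target,))
print(os.path.exists(target))  # False
```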
return render(
request,


@ -1,7 +1,12 @@
# -*- coding: utf-8 -*-
"""Views pertaining to builds."""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
import json
import logging
import re
@ -16,7 +21,6 @@ from readthedocs.projects import constants
from readthedocs.projects.models import Feature, Project
from readthedocs.projects.tasks import sync_repository_task
log = logging.getLogger(__name__)
@ -43,14 +47,13 @@ def _build_version(project, slug, already_built=()):
version = project.versions.filter(active=True, slug=slug).first()
if version and slug not in already_built:
log.info(
'(Version build) Building %s:%s',
project.slug,
version.slug,
"(Version build) Building %s:%s",
project.slug, version.slug,
)
trigger_build(project=project, version=version, force=True)
return slug
log.info('(Version build) Not Building %s', slug)
log.info("(Version build) Not Building %s", slug)
return None
@ -67,11 +70,8 @@ def build_branches(project, branch_list):
for branch in branch_list:
versions = project.versions_from_branch_name(branch)
for version in versions:
log.info(
'(Branch Build) Processing %s:%s',
project.slug,
version.slug,
)
log.info("(Branch Build) Processing %s:%s",
project.slug, version.slug)
ret = _build_version(project, version.slug, already_built=to_build)
if ret:
to_build.add(ret)
@ -95,7 +95,9 @@ def sync_versions(project):
try:
version_identifier = project.get_default_branch()
version = (
project.versions.filter(identifier=version_identifier).first()
project.versions
.filter(identifier=version_identifier)
.first()
)
if not version:
log.info('Unable to sync from %s version', version_identifier)
@ -118,13 +120,10 @@ def get_project_from_url(url):
def log_info(project, msg):
log.info(
constants.LOG_TEMPLATE.format(
project=project,
version='',
msg=msg,
)
)
log.info(constants.LOG_TEMPLATE
.format(project=project,
version='',
msg=msg))
def _build_url(url, projects, branches):
@ -134,7 +133,7 @@ def _build_url(url, projects, branches):
Check each of the ``branches`` to see if they are active and should be
built.
"""
ret = ''
ret = ""
all_built = {}
all_not_building = {}
@ -157,19 +156,15 @@ def _build_url(url, projects, branches):
for project_slug, built in list(all_built.items()):
if built:
msg = '(URL Build) Build Started: {} [{}]'.format(
url,
' '.join(built),
)
msg = '(URL Build) Build Started: %s [%s]' % (
url, ' '.join(built))
log_info(project_slug, msg=msg)
ret += msg
for project_slug, not_building in list(all_not_building.items()):
if not_building:
msg = '(URL Build) Not Building: {} [{}]'.format(
url,
' '.join(not_building),
)
msg = '(URL Build) Not Building: %s [%s]' % (
url, ' '.join(not_building))
log_info(project_slug, msg=msg)
ret += msg
@ -203,8 +198,7 @@ def github_build(request): # noqa: D205
else:
data = json.loads(request.body)
http_url = data['repository']['url']
http_search_url = http_url.replace('http://',
'').replace('https://', '')
http_search_url = http_url.replace('http://', '').replace('https://', '')
ssh_url = data['repository']['ssh_url']
ssh_search_url = ssh_url.replace('git@', '').replace('.git', '')
branches = [data['ref'].replace('refs/heads/', '')]
@ -217,14 +211,14 @@ def github_build(request): # noqa: D205
log.info(
'GitHub webhook search: url=%s branches=%s',
http_search_url,
branches,
branches
)
ssh_projects = get_project_from_url(ssh_search_url)
if ssh_projects:
log.info(
'GitHub webhook search: url=%s branches=%s',
ssh_search_url,
branches,
branches
)
projects = repo_projects | ssh_projects
return _build_url(http_search_url, projects, branches)
@ -299,26 +293,24 @@ def bitbucket_build(request):
else:
data = json.loads(request.body)
version = 2 if request.META.get(
'HTTP_USER_AGENT'
) == 'Bitbucket-Webhooks/2.0' else 1
version = 2 if request.META.get('HTTP_USER_AGENT') == 'Bitbucket-Webhooks/2.0' else 1
if version == 1:
branches = [
commit.get('branch', '') for commit in data['commits']
]
branches = [commit.get('branch', '')
for commit in data['commits']]
repository = data['repository']
if not repository['absolute_url']:
return HttpResponse('Invalid request', status=400)
search_url = 'bitbucket.org{}'.format(
repository['absolute_url'].rstrip('/'),
search_url = 'bitbucket.org{0}'.format(
repository['absolute_url'].rstrip('/')
)
elif version == 2:
changes = data['push']['changes']
branches = [change['new']['name'] for change in changes]
branches = [change['new']['name']
for change in changes]
if not data['repository']['full_name']:
return HttpResponse('Invalid request', status=400)
search_url = 'bitbucket.org/{}'.format(
data['repository']['full_name'],
search_url = 'bitbucket.org/{0}'.format(
data['repository']['full_name']
)
except (TypeError, ValueError, KeyError):
log.exception('Invalid Bitbucket webhook payload')
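The two payload branches above differ only in where the branch names live: v1 keeps them on each commit, v2 nests them under `push.changes`. Condensed into one helper, with the payload shapes abbreviated to just the keys used here:

```python
def extract_branches(version, data):
    """Pull branch names from a Bitbucket webhook payload (v1 or v2)."""
    if version == 1:
        return [commit.get('branch', '') for commit in data['commits']]
    # Version 2 ('Bitbucket-Webhooks/2.0' user agent).
    return [change['new']['name'] for change in data['push']['changes']]

print(extract_branches(1, {'commits': [{'branch': 'master'}]}))  # ['master']
print(extract_branches(2, {'push': {'changes': [{'new': {'name': 'dev'}}]}}))  # ['dev']
```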
@ -366,12 +358,10 @@ def generic_build(request, project_id_or_slug=None):
project = Project.objects.get(slug=project_id_or_slug)
except (Project.DoesNotExist, ValueError):
log.exception(
'(Incoming Generic Build) Repo not found: %s',
project_id_or_slug,
)
"(Incoming Generic Build) Repo not found: %s",
project_id_or_slug)
return HttpResponseNotFound(
'Repo not found: %s' % project_id_or_slug,
)
'Repo not found: %s' % project_id_or_slug)
# This endpoint doesn't require authorization, we shouldn't allow builds to
# be triggered from this any longer. Deprecation plan is to selectively
# allow access to this endpoint for now.
@ -380,11 +370,11 @@ def generic_build(request, project_id_or_slug=None):
if request.method == 'POST':
slug = request.POST.get('version_slug', project.default_version)
log.info(
'(Incoming Generic Build) %s [%s]',
"(Incoming Generic Build) %s [%s]",
project.slug,
slug,
)
_build_version(project, slug)
else:
return HttpResponse('You must POST to this resource.')
return HttpResponse("You must POST to this resource.")
return redirect('builds_project_list', project.slug)


@ -1,5 +1,4 @@
# -*- coding: utf-8 -*-
"""
Doc serving from Python.
@ -26,14 +25,19 @@ PYTHON_MEDIA (False) - Set this to True to serve docs & media from Python
SERVE_DOCS (['private']) - The list of ['private', 'public'] docs to serve.
"""
from __future__ import (
absolute_import, division, print_function, unicode_literals)
import logging
import mimetypes
import os
from functools import wraps
from django.conf import settings
from django.http import Http404, HttpResponse, HttpResponseRedirect
from django.shortcuts import get_object_or_404, render
from django.http import HttpResponse, HttpResponseRedirect, Http404
from django.shortcuts import get_object_or_404
from django.shortcuts import render
from django.utils.encoding import iri_to_uri
from django.views.static import serve
from readthedocs.builds.models import Version
@ -43,7 +47,6 @@ from readthedocs.core.symlink import PrivateSymlink, PublicSymlink
from readthedocs.projects import constants
from readthedocs.projects.models import Project, ProjectRelationship
log = logging.getLogger(__name__)
@ -55,11 +58,8 @@ def map_subproject_slug(view_func):
.. warning:: Does not take into account any kind of privacy settings.
"""
@wraps(view_func)
def inner_view( # noqa
request, subproject=None, subproject_slug=None, *args, **kwargs
):
def inner_view(request, subproject=None, subproject_slug=None, *args, **kwargs): # noqa
if subproject is None and subproject_slug:
# Try to fetch by subproject alias first, otherwise we might end up
# redirected to an unrelated project.
@ -85,11 +85,8 @@ def map_project_slug(view_func):
.. warning:: Does not take into account any kind of privacy settings.
"""
@wraps(view_func)
def inner_view( # noqa
request, project=None, project_slug=None, *args, **kwargs
):
def inner_view(request, project=None, project_slug=None, *args, **kwargs): # noqa
if project is None:
if not project_slug:
project_slug = request.slug
@ -114,14 +111,13 @@ def redirect_project_slug(request, project, subproject): # pylint: disable=unus
def redirect_page_with_filename(request, project, subproject, filename): # pylint: disable=unused-argument # noqa
"""Redirect /page/file.html to /en/latest/file.html."""
return HttpResponseRedirect(
resolve(subproject or project, filename=filename),
)
resolve(subproject or project, filename=filename))
def _serve_401(request, project):
res = render(request, '401.html')
res.status_code = 401
log.debug('Unauthorized access to {} documentation'.format(project.slug))
log.debug('Unauthorized access to {0} documentation'.format(project.slug))
return res
@ -133,17 +129,23 @@ def _serve_file(request, filename, basepath):
# Serve from Nginx
content_type, encoding = mimetypes.guess_type(
os.path.join(basepath, filename),
)
os.path.join(basepath, filename))
content_type = content_type or 'application/octet-stream'
response = HttpResponse(content_type=content_type)
if encoding:
response['Content-Encoding'] = encoding
try:
response['X-Accel-Redirect'] = os.path.join(
iri_path = os.path.join(
basepath[len(settings.SITE_ROOT):],
filename,
)
# NGINX does not support non-ASCII characters in the header, so we
# convert the IRI path to URI so it's compatible with what NGINX expects
# as the header value.
# https://github.com/benoitc/gunicorn/issues/1448
# https://docs.djangoproject.com/en/1.11/ref/unicode/#uri-and-iri-handling
x_accel_redirect = iri_to_uri(iri_path)
response['X-Accel-Redirect'] = x_accel_redirect
except UnicodeEncodeError:
raise Http404
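The `iri_to_uri` call above exists because nginx rejects non-ASCII header values, so the IRI path must be percent-encoded before it goes into `X-Accel-Redirect`. A stdlib approximation of what Django's helper does, using `urllib.parse.quote` (the `safe` set here is an assumption for illustration, not necessarily Django's exact character set):

```python
from urllib.parse import quote

def iri_path_to_uri(path):
    # Percent-encode non-ASCII bytes while leaving URL-reserved characters
    # intact, so the value fits in an ASCII-only response header.
    return quote(path, safe="/#%[]=:;$&()+,!?*@'~")

print(iri_path_to_uri('/media/docs/proj/ü.html'))
# /media/docs/proj/%C3%BC.html
```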
@ -153,14 +155,9 @@ def _serve_file(request, filename, basepath):
@map_project_slug
@map_subproject_slug
def serve_docs(
request,
project,
subproject,
lang_slug=None,
version_slug=None,
filename='',
):
"""Map existing proj, lang, version, filename views to the file format."""
request, project, subproject, lang_slug=None, version_slug=None,
filename=''):
"""Exists to map existing proj, lang, version, filename views to the file format."""
if not version_slug:
version_slug = project.get_default_version()
try:
@ -225,5 +222,50 @@ def _serve_symlink_docs(request, project, privacy_level, filename=''):
files_tried.append(os.path.join(basepath, filename))
raise Http404(
'File not found. Tried these files: %s' % ','.join(files_tried),
'File not found. Tried these files: %s' % ','.join(files_tried))
@map_project_slug
def robots_txt(request, project):
"""
Serve the user's custom ``/robots.txt``.
If the user added a ``robots.txt`` in the "default version" of the project,
we serve it directly.
"""
# Use the ``robots.txt`` file from the default version configured
version_slug = project.get_default_version()
version = project.versions.get(slug=version_slug)
no_serve_robots_txt = any([
# If project is private or,
project.privacy_level == constants.PRIVATE,
# default version is private or,
version.privacy_level == constants.PRIVATE,
# default version is not active or,
not version.active,
# default version is not built
not version.built,
])
if no_serve_robots_txt:
# ... we do return a 404
raise Http404()
filename = resolve_path(
project,
version_slug=version_slug,
filename='robots.txt',
subdomain=True, # subdomain will make it a "full" path without a URL prefix
)
# This breaks path joining, by ignoring the root when given an "absolute" path
if filename[0] == '/':
filename = filename[1:]
basepath = PublicSymlink(project).project_root
fullpath = os.path.join(basepath, filename)
if os.path.exists(fullpath):
return HttpResponse(open(fullpath).read(), content_type='text/plain')
return HttpResponse('User-agent: *\nAllow: /\n', content_type='text/plain')
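The `any([...])` guard above refuses to serve a custom `robots.txt` unless the project and its default version are public, active, and built. Distilled into a standalone predicate (names are illustrative; the real check reads model attributes):

```python
PRIVATE = 'private'  # stand-in for readthedocs.projects.constants.PRIVATE

def should_serve_robots_txt(project_privacy, version_privacy, active, built):
    # Serving is refused if any single condition fails.
    return not any([
        project_privacy == PRIVATE,
        version_privacy == PRIVATE,
        not active,
        not built,
    ])

print(should_serve_robots_txt('public', 'public', True, True))   # True
print(should_serve_robots_txt('public', 'private', True, True))  # False
```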


@ -84,7 +84,7 @@ class BaseMkdocs(BaseBuilder):
"""
Load a YAML config.
Raise BuildEnvironmentError if failed due to syntax errors.
:raises: ``MkDocsYAMLParseError`` if failed due to syntax errors.
"""
try:
return yaml.safe_load(open(self.yaml_file, 'r'),)
@ -105,7 +105,12 @@ class BaseMkdocs(BaseBuilder):
)
def append_conf(self, **__):
"""Set mkdocs config values."""
"""
Set mkdocs config values.
:raises: ``MkDocsYAMLParseError`` if failed due to known type errors
(i.e. expecting a list and a string is found).
"""
if not self.yaml_file:
self.yaml_file = os.path.join(self.root_path, 'mkdocs.yml')
@ -113,12 +118,27 @@ class BaseMkdocs(BaseBuilder):
# Handle custom docs dirs
user_docs_dir = user_config.get('docs_dir')
if not isinstance(user_docs_dir, (type(None), str)):
raise MkDocsYAMLParseError(
MkDocsYAMLParseError.INVALID_DOCS_DIR_CONFIG,
)
docs_dir = self.docs_dir(docs_dir=user_docs_dir)
self.create_index(extension='md')
user_config['docs_dir'] = docs_dir
# Set mkdocs config values
static_url = get_absolute_static_url()
for config in ('extra_css', 'extra_javascript'):
user_value = user_config.get(config, [])
if not isinstance(user_value, list):
raise MkDocsYAMLParseError(
MkDocsYAMLParseError.INVALID_EXTRA_CONFIG.format(
config=config,
),
)
user_config.setdefault('extra_javascript', []).extend([
'readthedocs-data.js',
'%score/js/readthedocs-doc-embed.js' % static_url,


@ -41,8 +41,6 @@ def load_yaml_config(version):
'build': {
'image': img_name,
},
'output_base': '',
'name': version.slug,
'defaults': {
'install_project': project.install_project,
'formats': get_default_formats(project),
@ -57,7 +55,6 @@ def load_yaml_config(version):
img_settings = DOCKER_IMAGE_SETTINGS.get(img_name, None)
if img_settings:
env_config.update(img_settings)
env_config['DOCKER_IMAGE_SETTINGS'] = img_settings
try:
config = load_config(


@ -23,6 +23,7 @@ from readthedocs.builds.constants import BUILD_STATE_FINISHED
from readthedocs.builds.models import BuildCommandResultMixin
from readthedocs.core.utils import slugify
from readthedocs.projects.constants import LOG_TEMPLATE
from readthedocs.projects.models import Feature
from readthedocs.restapi.client import api as api_v2
from .constants import (
@ -759,10 +760,18 @@ class DockerBuildEnvironment(BuildEnvironment):
project_name=self.project.slug,
)[:DOCKER_HOSTNAME_MAX_LEN],
)
# Decide what Docker image to use, based on priorities:
# Use the Docker image set by our feature flag: ``testing`` or,
if self.project.has_feature(Feature.USE_TESTING_BUILD_IMAGE):
self.container_image = 'readthedocs/build:testing'
# the image set by user or,
if self.config and self.config.build.image:
self.container_image = self.config.build.image
# the image overridden by the project (manually set by an admin).
if self.project.container_image:
self.container_image = self.project.container_image
if self.project.container_mem_limit:
self.container_mem_limit = self.project.container_mem_limit
if self.project.container_time_limit:
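
The three `if` blocks above implement a simple last-writer-wins priority chain for choosing the build image. Extracted as a standalone sketch (the signature and default image name are illustrative, not the actual API):

```python
def resolve_build_image(use_testing=False, config_image=None,
                        admin_image=None,
                        default="readthedocs/build:latest"):
    # Later assignments win: an admin override on the project beats
    # the user's config file, which beats the testing feature flag.
    image = default
    if use_testing:
        image = "readthedocs/build:testing"
    if config_image:
        image = config_image
    if admin_image:
        image = admin_image
    return image
```

Writing the chain as successive plain `if` statements (rather than `elif`) is what makes the last matching rule win.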


@ -60,3 +60,13 @@ class MkDocsYAMLParseError(BuildEnvironmentError):
GENERIC_WITH_PARSE_EXCEPTION = ugettext_noop(
'Problem parsing MkDocs YAML configuration. {exception}',
)
INVALID_DOCS_DIR_CONFIG = ugettext_noop(
'The "docs_dir" config from your MkDocs YAML config file has to be a '
'string with relative or absolute path.',
)
INVALID_EXTRA_CONFIG = ugettext_noop(
'The "{config}" config from your MkDocs YAML config file has to be a '
'list of relative paths.',
)

View File

@ -75,8 +75,9 @@ class PythonEnvironment:
','.join(self.config.python.extra_requirements),
)
self.build_env.run(
'python',
self.venv_bin(filename='pip'),
self.venv_bin(filename='python'),
'-m',
'pip',
'install',
'--ignore-installed',
'--cache-dir',
@ -87,7 +88,7 @@ class PythonEnvironment:
)
elif self.config.python.install_with_setup:
self.build_env.run(
'python',
self.venv_bin(filename='python'),
'setup.py',
'install',
'--force',
@ -237,8 +238,9 @@ class Virtualenv(PythonEnvironment):
def install_core_requirements(self):
"""Install basic Read the Docs requirements into the virtualenv."""
pip_install_cmd = [
'python',
self.venv_bin(filename='pip'),
self.venv_bin(filename='python'),
'-m',
'pip',
'install',
'--upgrade',
'--cache-dir',
@ -318,8 +320,9 @@ class Virtualenv(PythonEnvironment):
if requirements_file_path:
args = [
'python',
self.venv_bin(filename='pip'),
self.venv_bin(filename='python'),
'-m',
'pip',
'install',
]
if self.project.has_feature(Feature.PIP_ALWAYS_UPGRADE):
@ -367,6 +370,7 @@ class Conda(PythonEnvironment):
'conda',
'env',
'create',
'--quiet',
'--name',
self.version.slug,
'--file',
@ -398,6 +402,7 @@ class Conda(PythonEnvironment):
'conda',
'install',
'--yes',
'--quiet',
'--name',
self.version.slug,
]
@ -408,8 +413,9 @@ class Conda(PythonEnvironment):
)
pip_cmd = [
'python',
self.venv_bin(filename='pip'),
self.venv_bin(filename='python'),
'-m',
'pip',
'install',
'-U',
'--cache-dir',
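
The pattern repeated throughout this file, invoking pip as `python -m pip` with the virtualenv's interpreter rather than through the `pip` wrapper script, ensures packages land in the environment whose interpreter actually runs the command and avoids depending on the wrapper's shebang line. A sketch of assembling such a command line (the paths and helper are illustrative):

```python
import os


def pip_install_cmd(venv_path, *packages):
    # "python -m pip" binds the installation to this exact
    # interpreter; the bare `pip` script can point at a different
    # Python after an in-place upgrade.
    python = os.path.join(venv_path, "bin", "python")
    return [python, "-m", "pip", "install", "--upgrade", *packages]


pip_install_cmd("/tmp/venv", "sphinx")
# on POSIX: ['/tmp/venv/bin/python', '-m', 'pip', 'install', '--upgrade', 'sphinx']
```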


@ -33,16 +33,16 @@ $(document).ready(function () {
{% endblock %}
{% block edit_content %}
<div>
<h2>Read the Docs Gold</h2>
<div>
<h2>Read the Docs Gold</h2>
<p>
{% blocktrans trimmed %}
Supporting Read the Docs lets us work more on features that people love.
Your money will go directly to maintenance and development of the
product.
{% endblocktrans %}
</p>
<p>
{% blocktrans trimmed %}
Supporting Read the Docs lets us work more on features that people love.
Your money will go directly to maintenance and development of the
product.
{% endblocktrans %}
</p>
<p>
{% blocktrans trimmed %}
If you are an individual,
@ -60,85 +60,86 @@ $(document).ready(function () {
{% endblocktrans %}
</p>
<p>{% trans 'Becoming a Gold Member also makes Read the Docs ad-free for as long as you are logged-in.' %}</p>
<p>{% trans 'Becoming a Gold Member also makes Read the Docs ad-free for as long as you are logged-in.' %}</p>
<p>
{% blocktrans trimmed %}
You can also make one-time donations on our <a href="https://readthedocs.org/sustainability/">sustainability</a> page.
{% endblocktrans %}
</p>
<p>
{% blocktrans trimmed %}
You can also make one-time donations on our <a href="https://readthedocs.org/sustainability/">sustainability</a> page.
{% endblocktrans %}
</p>
{% if domains.count %}
<h3>Domains</h2>
<p>
{% blocktrans trimmed %}
We ask that folks who use custom Domains give Read the Docs $5 per domain they are using.
This is optional, but it really does help us maintain the site going forward.
{% endblocktrans %}
</p>
{% if domains.count %}
<h3>Domains</h3>
<p>
{% blocktrans trimmed %}
We ask that folks who use custom Domains give Read the Docs $5 per domain they are using.
This is optional, but it really does help us maintain the site going forward.
{% endblocktrans %}
</p>
<p>
You are currently using {{ domains.count }} domains:
<p>
You are currently using {{ domains.count }} domains:
<ul class="donate-about">
{% for domain in domains %}
<li>{{ domain.domain }} ({{ domain.project.name }})</li>
{% endfor %}
</ul>
</p>
<ul class="donate-about">
{% for domain in domains %}
<li>{{ domain.domain }} ({{ domain.project.name }})</li>
{% endfor %}
</ul>
</p>
{% endif %}
{% trans "Become a Gold Member" as subscription_title %}
{% if golduser %}
{% trans "Update Your Subscription" as subscription_title %}
{% endif %}
<h3>{{ subscription_title }}</h3>
<form accept-charset="UTF-8" action="" method="post" id="gold-register" class="payment">
{% csrf_token %}
{{ form.non_field_errors }}
{% for field in form.fields_with_cc_group %}
{% if field.is_cc_group %}
<p
data-bind="visible: last_4_card_digits"
style="display: none;"
class="subscription-card">
<label>{% trans "Current card" %}:</label>
<span class="subscription-card-number">
****-<span data-bind="text: last_4_card_digits"></span>
</span>
</p>
<div data-bind="visible: !show_card_form()">
<a
href="#"
data-bind="click: function () { is_editing_card(true); }"
class="subscription-edit-link">
{% trans "Edit Card" %}
</a>
</div>
<div
class="subscription-card"
data-bind="visible: show_card_form"
style="display: none;">
{% for groupfield in field.fields %}
{% include 'core/ko_form_field.html' with field=groupfield %}
{% endfor %}
</div>
{% else %}
{% include 'core/ko_form_field.html' with field=field %}
{% endif %}
{% endfor %}
{% trans "Sign Up" as form_submit_text %}
{% if golduser %}
{% trans "Update Subscription" as form_submit_text %}
{% endif %}
<input type="submit" value="{{ form_submit_text }}" data-bind="click: process_full_form" />
<em>{% trans "All information is submitted directly to Stripe." %}</em>
</form>
{% trans "Become a Gold Member" as subscription_title %}
{% if golduser %}
{% trans "Update Your Subscription" as subscription_title %}
{% endif %}
<h3>{{ subscription_title }}</h3>
<form accept-charset="UTF-8" action="" method="post" id="gold-register" class="payment">
{% csrf_token %}
{{ form.non_field_errors }}
{% for field in form.fields_with_cc_group %}
{% if field.is_cc_group %}
<p
data-bind="visible: last_4_card_digits"
style="display: none;"
class="subscription-card">
<label>{% trans "Current card" %}:</label>
<span class="subscription-card-number">
****-<span data-bind="text: last_4_card_digits"></span>
</span>
</p>
<div data-bind="visible: !show_card_form()">
<a
href="#"
data-bind="click: function () { is_editing_card(true); }"
class="subscription-edit-link">
{% trans "Edit Card" %}
</a>
</div>
<div
class="subscription-card"
data-bind="visible: show_card_form"
style="display: none;">
{% for groupfield in field.fields %}
{% include 'core/ko_form_field.html' with field=groupfield %}
{% endfor %}
</div>
{% else %}
{% include 'core/ko_form_field.html' with field=field %}
{% endif %}
{% endfor %}
{% trans "Sign Up" as form_submit_text %}
{% if golduser %}
{% trans "Update Subscription" as form_submit_text %}
{% endif %}
<input type="submit" value="{{ form_submit_text }}" data-bind="click: process_full_form" />
<em>{% trans "All information is submitted directly to Stripe." %}</em>
</form>
</div>
{% endblock %}


@ -29,11 +29,7 @@ def send_notification(request, notification):
backends = getattr(settings, 'NOTIFICATION_BACKENDS', [])
for cls_name in backends:
backend = import_string(cls_name)(request)
# Do not send email notification if defined explicitly
if backend.name == EmailBackend.name and not notification.send_email:
pass
else:
backend.send(notification)
backend.send(notification)
class Backend:
@ -52,11 +48,16 @@ class EmailBackend(Backend):
The content body is first rendered from an on-disk template, then passed
into the standard email templates as a string.
If the notification is set to ``send_email=False``, this backend will exit
early from :py:meth:`send`.
"""
name = 'email'
def send(self, notification):
if not notification.send_email:
return
# FIXME: if the level is an ERROR an email is received and sometimes
# it's not necessary. This behavior should be clearly documented in the
# code
@ -111,6 +112,6 @@ class SiteBackend(Backend):
backend_name=self.name,
source_format=HTML,
),
extra_tags='',
extra_tags=notification.extra_tags,
user=notification.user,
)
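
Moving the `send_email` check out of the dispatch loop and into `EmailBackend.send` keeps the loop backend-agnostic: each backend decides for itself whether to act. A stripped-down sketch of the pattern, using illustrative stand-in classes rather than the real API:

```python
class Backend:
    name = "base"

    def send(self, notification):
        raise NotImplementedError


class EmailBackend(Backend):
    name = "email"

    def __init__(self):
        self.sent = []

    def send(self, notification):
        # The backend opts out on its own; the caller just calls
        # send() on every configured backend.
        if not notification.send_email:
            return
        self.sent.append(notification)


class SiteBackend(Backend):
    name = "site"

    def __init__(self):
        self.sent = []

    def send(self, notification):
        self.sent.append(notification)
```

With this shape, the dispatch loop reduces to `for backend in backends: backend.send(notification)` with no special case for email.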


@ -35,6 +35,7 @@ class Notification:
subject = None
user = None
send_email = True
extra_tags = ''
def __init__(self, context_object, request, user=None):
self.object = context_object


@ -220,6 +220,14 @@ class GitHubService(Service):
project,
)
return (True, resp)
if resp.status_code in [401, 403, 404]:
log.info(
'GitHub project does not exist or user does not have '
'permissions: project=%s',
project,
)
return (False, resp)
# Catch exceptions with request or deserializing JSON
except (RequestException, ValueError):
log.exception(
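
The new branch distinguishes a definitive rejection from a transient failure: GitHub answers 401/403 when the token lacks access and 404 when the repository is missing (or private and invisible to the token), so retrying will not help. A simplified stand-in for the `(True, resp)` / `(False, resp)` returns above:

```python
def triage_webhook_response(status_code):
    # 201: webhook created successfully.
    # 401/403/404: no permission or repository not found; a definitive
    # failure, so report it instead of treating it as transient.
    if status_code == 201:
        return (True, "created")
    if status_code in (401, 403, 404):
        return (False, "no permission or repository not found")
    return (False, "unexpected response")
```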


@ -1,7 +1,13 @@
# -*- coding: utf-8 -*-
"""Django administration interface for `projects.models`"""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
from django.contrib import admin, messages
from django.contrib.admin.actions import delete_selected
from django.utils.translation import ugettext_lazy as _
@ -24,12 +30,20 @@ from .models import (
ProjectRelationship,
WebHook,
)
from .notifications import ResourceUsageNotification
from .tasks import remove_dir
from .notifications import (
DeprecatedBuildWebhookNotification,
DeprecatedGitHubWebhookNotification,
ResourceUsageNotification,
)
from .tasks import remove_dirs
class ProjectSendNotificationView(SendNotificationView):
notification_classes = [ResourceUsageNotification]
notification_classes = [
ResourceUsageNotification,
DeprecatedBuildWebhookNotification,
DeprecatedGitHubWebhookNotification,
]
def get_object_recipients(self, obj):
for owner in obj.users.all():
@ -95,7 +109,9 @@ class ProjectOwnerBannedFilter(admin.SimpleListFilter):
OWNER_BANNED = 'true'
def lookups(self, request, model_admin):
return ((self.OWNER_BANNED, _('Yes')),)
return (
(self.OWNER_BANNED, _('Yes')),
)
def queryset(self, request, queryset):
if self.value() == self.OWNER_BANNED:
@ -109,22 +125,13 @@ class ProjectAdmin(GuardedModelAdmin):
prepopulated_fields = {'slug': ('name',)}
list_display = ('name', 'slug', 'repo', 'repo_type', 'featured')
list_filter = (
'repo_type',
'featured',
'privacy_level',
'documentation_type',
'programming_language',
ProjectOwnerBannedFilter,
)
list_filter = ('repo_type', 'featured', 'privacy_level',
'documentation_type', 'programming_language',
'feature__feature_id', ProjectOwnerBannedFilter)
list_editable = ('featured',)
search_fields = ('slug', 'repo')
inlines = [
ProjectRelationshipInline,
RedirectInline,
VersionInline,
DomainInline,
]
inlines = [ProjectRelationshipInline, RedirectInline,
VersionInline, DomainInline]
readonly_fields = ('feature_flags',)
raw_id_fields = ('users', 'main_language_project')
actions = ['send_owner_email', 'ban_owner']
@ -134,7 +141,7 @@ class ProjectAdmin(GuardedModelAdmin):
def send_owner_email(self, request, queryset):
view = ProjectSendNotificationView.as_view(
action_name='send_owner_email',
action_name='send_owner_email'
)
return view(request, queryset=queryset)
@ -151,25 +158,18 @@ class ProjectAdmin(GuardedModelAdmin):
total = 0
for project in queryset:
if project.users.count() == 1:
count = (
UserProfile.objects.filter(user__projects=project
).update(banned=True)
)
count = (UserProfile.objects
.filter(user__projects=project)
.update(banned=True))
total += count
else:
messages.add_message(
request,
messages.ERROR,
'Project has multiple owners: {}'.format(project),
)
messages.add_message(request, messages.ERROR,
'Project has multiple owners: {0}'.format(project))
if total == 0:
messages.add_message(request, messages.ERROR, 'No users banned')
else:
messages.add_message(
request,
messages.INFO,
'Banned {} user(s)'.format(total),
)
messages.add_message(request, messages.INFO,
'Banned {0} user(s)'.format(total))
ban_owner.short_description = 'Ban project owner'
@ -182,15 +182,19 @@ class ProjectAdmin(GuardedModelAdmin):
"""
if request.POST.get('post'):
for project in queryset:
broadcast(type='app', task=remove_dir, args=[project.doc_path])
broadcast(
type='app',
task=remove_dirs,
args=[(project.doc_path,)],
)
return delete_selected(self, request, queryset)
def get_actions(self, request):
actions = super().get_actions(request)
actions = super(ProjectAdmin, self).get_actions(request)
actions['delete_selected'] = (
self.__class__.delete_selected_and_artifacts,
'delete_selected',
delete_selected.short_description,
delete_selected.short_description
)
return actions


@ -13,7 +13,6 @@ from django.utils.translation import ugettext_lazy as _
DOCUMENTATION_CHOICES = (
('auto', _('Automatically Choose')),
('sphinx', _('Sphinx Html')),
('mkdocs', _('Mkdocs (Markdown)')),
('sphinx_htmldir', _('Sphinx HtmlDir')),


@ -44,6 +44,10 @@ class RepositoryError(BuildEnvironmentError):
'You can not have two versions with the name latest or stable.',
)
FAILED_TO_CHECKOUT = _(
'Failed to checkout revision: {}'
)
def get_default_message(self):
if settings.ALLOW_PRIVATE_REPOS:
return self.PRIVATE_ALLOWED


@ -2,15 +2,35 @@
"""Project forms."""
from random import choice
from urllib.parse import urlparse
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
try:
# TODO: remove this when we deprecate Python2
# re.fullmatch is >= Py3.4 only
from re import fullmatch
except ImportError:
# https://stackoverflow.com/questions/30212413/backport-python-3-4s-regular-expression-fullmatch-to-python-2
import re
def fullmatch(regex, string, flags=0):
"""Emulate python-3.4 re.fullmatch().""" # noqa
return re.match("(?:" + regex + r")\Z", string, flags=flags)
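
The backport works because `re.match` already anchors at the start of the string; wrapping the caller's pattern in a non-capturing group and appending `\Z` anchors the end as well, reproducing `re.fullmatch` semantics:

```python
import re


def fullmatch_compat(regex, string, flags=0):
    # (?:...) keeps the caller's pattern intact (alternations would
    # otherwise bind wrongly), and \Z forces the match to consume
    # the whole string.
    return re.match("(?:" + regex + r")\Z", string, flags=flags)


assert fullmatch_compat("[a-z]+", "abc")
assert fullmatch_compat("[a-z]+", "abc123") is None
assert re.match("[a-z]+", "abc123")  # plain match accepts the prefix
```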
from random import choice
from builtins import object
from django import forms
from django.conf import settings
from django.contrib.auth.models import User
from django.template.loader import render_to_string
from django.utils.safestring import mark_safe
from django.utils.translation import ugettext_lazy as _
from future.backports.urllib.parse import urlparse
from guardian.shortcuts import assign
from textclassifier.validators import ClassifierValidator
@ -24,6 +44,7 @@ from readthedocs.projects.exceptions import ProjectSpamError
from readthedocs.projects.models import (
Domain,
EmailHook,
EnvironmentVariable,
Feature,
Project,
ProjectRelationship,
@ -44,17 +65,17 @@ class ProjectForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
self.user = kwargs.pop('user', None)
super().__init__(*args, **kwargs)
super(ProjectForm, self).__init__(*args, **kwargs)
def save(self, commit=True):
project = super().save(commit)
project = super(ProjectForm, self).save(commit)
if commit:
if self.user and not project.users.filter(pk=self.user.pk).exists():
project.users.add(self.user)
return project
class ProjectTriggerBuildMixin:
class ProjectTriggerBuildMixin(object):
"""
Mixin to trigger build on form save.
@ -65,7 +86,7 @@ class ProjectTriggerBuildMixin:
def save(self, commit=True):
"""Trigger build on commit save."""
project = super().save(commit)
project = super(ProjectTriggerBuildMixin, self).save(commit)
if commit:
trigger_build(project=project)
return project
@ -82,7 +103,7 @@ class ProjectBasicsForm(ProjectForm):
"""Form for basic project fields."""
class Meta:
class Meta(object):
model = Project
fields = ('name', 'repo', 'repo_type')
@ -93,7 +114,7 @@ class ProjectBasicsForm(ProjectForm):
def __init__(self, *args, **kwargs):
show_advanced = kwargs.pop('show_advanced', False)
super().__init__(*args, **kwargs)
super(ProjectBasicsForm, self).__init__(*args, **kwargs)
if show_advanced:
self.fields['advanced'] = forms.BooleanField(
required=False,
@ -104,7 +125,7 @@ class ProjectBasicsForm(ProjectForm):
def save(self, commit=True):
"""Add remote repository relationship to the project instance."""
instance = super().save(commit)
instance = super(ProjectBasicsForm, self).save(commit)
remote_repo = self.cleaned_data.get('remote_repository', None)
if remote_repo:
if commit:
@ -120,11 +141,12 @@ class ProjectBasicsForm(ProjectForm):
potential_slug = slugify(name)
if Project.objects.filter(slug=potential_slug).exists():
raise forms.ValidationError(
_('Invalid project name, a project already exists with that name'),
) # yapf: disable # noqa
_('Invalid project name, a project already exists with that name')) # yapf: disable # noqa
if not potential_slug:
# Check the generated slug won't be empty
raise forms.ValidationError(_('Invalid project name'),)
raise forms.ValidationError(
_('Invalid project name'),
)
return name
@ -156,7 +178,7 @@ class ProjectExtraForm(ProjectForm):
"""Additional project information form."""
class Meta:
class Meta(object):
model = Project
fields = (
'description',
@ -178,9 +200,7 @@ class ProjectExtraForm(ProjectForm):
for tag in tags:
if len(tag) > 100:
raise forms.ValidationError(
_(
'Length of each tag must be less than or equal to 100 characters.'
),
_('Length of each tag must be less than or equal to 100 characters.')
)
return tags
@ -192,13 +212,11 @@ class ProjectAdvancedForm(ProjectTriggerBuildMixin, ProjectForm):
python_interpreter = forms.ChoiceField(
choices=constants.PYTHON_CHOICES,
initial='python',
help_text=_(
'The Python interpreter used to create the virtual '
'environment.',
),
help_text=_('The Python interpreter used to create the virtual '
'environment.'),
)
class Meta:
class Meta(object):
model = Project
fields = (
# Standard build edits
@ -222,44 +240,35 @@ class ProjectAdvancedForm(ProjectTriggerBuildMixin, ProjectForm):
)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
super(ProjectAdvancedForm, self).__init__(*args, **kwargs)
default_choice = (None, '-' * 9)
all_versions = self.instance.versions.values_list(
'identifier',
'verbose_name',
'identifier', 'verbose_name'
)
self.fields['default_branch'].widget = forms.Select(
choices=[default_choice] + list(all_versions),
choices=[default_choice] + list(all_versions)
)
active_versions = self.instance.all_active_versions().values_list(
'slug',
'verbose_name',
'slug', 'verbose_name'
)
self.fields['default_version'].widget = forms.Select(
choices=active_versions,
choices=active_versions
)
def clean_conf_py_file(self):
filename = self.cleaned_data.get('conf_py_file', '').strip()
if filename and 'conf.py' not in filename:
raise forms.ValidationError(
_(
'Your configuration file is invalid, make sure it contains '
'conf.py in it.',
),
) # yapf: disable
_('Your configuration file is invalid, make sure it contains '
'conf.py in it.')) # yapf: disable
return filename
class UpdateProjectForm(
ProjectTriggerBuildMixin,
ProjectBasicsForm,
ProjectExtraForm,
):
class Meta:
class UpdateProjectForm(ProjectTriggerBuildMixin, ProjectBasicsForm,
ProjectExtraForm):
class Meta(object):
model = Project
fields = (
# Basics
@ -281,26 +290,27 @@ class UpdateProjectForm(
if project:
msg = _(
'There is already a "{lang}" translation '
'for the {proj} project.',
'for the {proj} project.'
)
if project.translations.filter(language=language).exists():
raise forms.ValidationError(
msg.format(lang=language, proj=project.slug),
msg.format(lang=language, proj=project.slug)
)
main_project = project.main_language_project
if main_project:
if main_project.language == language:
raise forms.ValidationError(
msg.format(lang=language, proj=main_project.slug),
msg.format(lang=language, proj=main_project.slug)
)
siblings = (
main_project.translations.filter(language=language
).exclude(pk=project.pk
).exists()
main_project.translations
.filter(language=language)
.exclude(pk=project.pk)
.exists()
)
if siblings:
raise forms.ValidationError(
msg.format(lang=language, proj=main_project.slug),
msg.format(lang=language, proj=main_project.slug)
)
return language
@ -311,20 +321,18 @@ class ProjectRelationshipBaseForm(forms.ModelForm):
parent = forms.CharField(widget=forms.HiddenInput(), required=False)
class Meta:
class Meta(object):
model = ProjectRelationship
fields = '__all__'
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project')
self.user = kwargs.pop('user')
super().__init__(*args, **kwargs)
super(ProjectRelationshipBaseForm, self).__init__(*args, **kwargs)
# Don't display the update form with an editable child, as it will be
# filtered out from the queryset anyways.
if hasattr(self, 'instance') and self.instance.pk is not None:
self.fields['child'].queryset = Project.objects.filter(
pk=self.instance.child.pk
)
self.fields['child'].queryset = Project.objects.filter(pk=self.instance.child.pk)
else:
self.fields['child'].queryset = self.get_subproject_queryset()
@ -333,16 +341,14 @@ class ProjectRelationshipBaseForm(forms.ModelForm):
# This validation error is mostly for testing, users shouldn't see
# this in normal circumstances
raise forms.ValidationError(
_('Subproject nesting is not supported'),
)
_('Subproject nesting is not supported'))
return self.project
def clean_child(self):
child = self.cleaned_data['child']
if child == self.project:
raise forms.ValidationError(
_('A project can not be a subproject of itself'),
)
_('A project can not be a subproject of itself'))
return child
def get_subproject_queryset(self):
@ -353,10 +359,10 @@ class ProjectRelationshipBaseForm(forms.ModelForm):
project, or are a superproject, as neither case is supported.
"""
queryset = (
Project.objects.for_admin_user(self.user).exclude(
subprojects__isnull=False
).exclude(superprojects__isnull=False).exclude(pk=self.project.pk)
)
Project.objects.for_admin_user(self.user)
.exclude(subprojects__isnull=False)
.exclude(superprojects__isnull=False)
.exclude(pk=self.project.pk))
return queryset
@ -369,11 +375,11 @@ class DualCheckboxWidget(forms.CheckboxInput):
"""Checkbox with link to the version's built documentation."""
def __init__(self, version, attrs=None, check_test=bool):
super().__init__(attrs, check_test)
super(DualCheckboxWidget, self).__init__(attrs, check_test)
self.version = version
def render(self, name, value, attrs=None, renderer=None):
checkbox = super().render(name, value, attrs, renderer)
checkbox = super(DualCheckboxWidget, self).render(name, value, attrs, renderer)
icon = self.render_icon()
return mark_safe('{}{}'.format(checkbox, icon))
@ -461,14 +467,12 @@ def build_versions_form(project):
class BaseUploadHTMLForm(forms.Form):
content = forms.FileField(label=_('Zip file of HTML'))
overwrite = forms.BooleanField(
required=False,
label=_('Overwrite existing HTML?'),
)
overwrite = forms.BooleanField(required=False,
label=_('Overwrite existing HTML?'))
def __init__(self, *args, **kwargs):
self.request = kwargs.pop('request', None)
super().__init__(*args, **kwargs)
super(BaseUploadHTMLForm, self).__init__(*args, **kwargs)
def clean(self):
version_slug = self.cleaned_data['version']
@ -508,15 +512,14 @@ class UserForm(forms.Form):
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(UserForm, self).__init__(*args, **kwargs)
def clean_user(self):
name = self.cleaned_data['user']
user_qs = User.objects.filter(username=name)
if not user_qs.exists():
raise forms.ValidationError(
_('User {name} does not exist').format(name=name),
)
_('User {name} does not exist').format(name=name))
self.user = user_qs[0]
return name
@ -535,13 +538,11 @@ class EmailHookForm(forms.Form):
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(EmailHookForm, self).__init__(*args, **kwargs)
def clean_email(self):
self.email = EmailHook.objects.get_or_create(
email=self.cleaned_data['email'],
project=self.project,
)[0]
email=self.cleaned_data['email'], project=self.project)[0]
return self.email
def save(self):
@ -555,13 +556,11 @@ class WebHookForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(WebHookForm, self).__init__(*args, **kwargs)
def save(self, commit=True):
self.webhook = WebHook.objects.get_or_create(
url=self.cleaned_data['url'],
project=self.project,
)[0]
url=self.cleaned_data['url'], project=self.project)[0]
self.project.webhook_notifications.add(self.webhook)
return self.project
@ -579,17 +578,15 @@ class TranslationBaseForm(forms.Form):
def __init__(self, *args, **kwargs):
self.parent = kwargs.pop('parent', None)
self.user = kwargs.pop('user')
super().__init__(*args, **kwargs)
super(TranslationBaseForm, self).__init__(*args, **kwargs)
self.fields['project'].choices = self.get_choices()
def get_choices(self):
return [(
project.slug,
'{project} ({lang})'.format(
project=project.slug,
lang=project.get_language_display(),
),
) for project in self.get_translation_queryset().all()]
return [
(project.slug, '{project} ({lang})'.format(
project=project.slug, lang=project.get_language_display()))
for project in self.get_translation_queryset().all()
]
def clean_project(self):
translation_project_slug = self.cleaned_data['project']
@ -598,31 +595,36 @@ class TranslationBaseForm(forms.Form):
if self.parent.main_language_project is not None:
msg = 'Project "{project}" is already a translation'
raise forms.ValidationError(
(_(msg).format(project=self.parent.slug)),
(_(msg).format(project=self.parent.slug))
)
project_translation_qs = self.get_translation_queryset().filter(
slug=translation_project_slug,
slug=translation_project_slug
)
if not project_translation_qs.exists():
msg = 'Project "{project}" does not exist.'
raise forms.ValidationError(
(_(msg).format(project=translation_project_slug)),
(_(msg).format(project=translation_project_slug))
)
self.translation = project_translation_qs.first()
if self.translation.language == self.parent.language:
msg = ('Both projects can not have the same language ({lang}).')
msg = (
'Both projects can not have the same language ({lang}).'
)
raise forms.ValidationError(
_(msg).format(lang=self.parent.get_language_display()),
_(msg).format(lang=self.parent.get_language_display())
)
exists_translation = (
self.parent.translations.filter(language=self.translation.language
).exists()
self.parent.translations
.filter(language=self.translation.language)
.exists()
)
if exists_translation:
msg = ('This project already has a translation for {lang}.')
msg = (
'This project already has a translation for {lang}.'
)
raise forms.ValidationError(
_(msg).format(lang=self.translation.get_language_display()),
_(msg).format(lang=self.translation.get_language_display())
)
is_parent = self.translation.translations.exists()
if is_parent:
@ -635,9 +637,9 @@ class TranslationBaseForm(forms.Form):
def get_translation_queryset(self):
queryset = (
Project.objects.for_admin_user(self.user).filter(
main_language_project=None
).exclude(pk=self.parent.pk)
Project.objects.for_admin_user(self.user)
.filter(main_language_project=None)
.exclude(pk=self.parent.pk)
)
return queryset
@ -657,13 +659,13 @@ class RedirectForm(forms.ModelForm):
"""Form for project redirects."""
class Meta:
class Meta(object):
model = Redirect
fields = ['redirect_type', 'from_url', 'to_url']
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(RedirectForm, self).__init__(*args, **kwargs)
def save(self, **_): # pylint: disable=arguments-differ
# TODO this should respect the unused argument `commit`. It's not clear
@ -684,13 +686,13 @@ class DomainBaseForm(forms.ModelForm):
project = forms.CharField(widget=forms.HiddenInput(), required=False)
class Meta:
class Meta(object):
model = Domain
exclude = ['machine', 'cname', 'count'] # pylint: disable=modelform-uses-exclude
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(DomainBaseForm, self).__init__(*args, **kwargs)
def clean_project(self):
return self.project
@ -707,12 +709,10 @@ class DomainBaseForm(forms.ModelForm):
canonical = self.cleaned_data['canonical']
_id = self.initial.get('id')
if canonical and Domain.objects.filter(
project=self.project,
canonical=True,
project=self.project, canonical=True
).exclude(pk=_id).exists():
raise forms.ValidationError(
_('Only 1 Domain can be canonical at a time.'),
)
_('Only 1 Domain can be canonical at a time.'))
return canonical
@ -730,13 +730,13 @@ class IntegrationForm(forms.ModelForm):
project = forms.CharField(widget=forms.HiddenInput(), required=False)
class Meta:
class Meta(object):
model = Integration
exclude = ['provider_data', 'exchanges'] # pylint: disable=modelform-uses-exclude
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(IntegrationForm, self).__init__(*args, **kwargs)
# Alter the integration type choices to only provider webhooks
self.fields['integration_type'].choices = Integration.WEBHOOK_INTEGRATIONS # yapf: disable # noqa
@ -745,20 +745,20 @@ class IntegrationForm(forms.ModelForm):
def save(self, commit=True):
self.instance = Integration.objects.subclass(self.instance)
return super().save(commit)
return super(IntegrationForm, self).save(commit)
class ProjectAdvertisingForm(forms.ModelForm):
"""Project promotion opt-out form."""
class Meta:
class Meta(object):
model = Project
fields = ['allow_promos']
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super().__init__(*args, **kwargs)
super(ProjectAdvertisingForm, self).__init__(*args, **kwargs)
class FeatureForm(forms.ModelForm):
@ -773,10 +773,56 @@ class FeatureForm(forms.ModelForm):
feature_id = forms.ChoiceField()
class Meta:
class Meta(object):
model = Feature
fields = ['projects', 'feature_id', 'default_true']
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
super(FeatureForm, self).__init__(*args, **kwargs)
self.fields['feature_id'].choices = Feature.FEATURES
class EnvironmentVariableForm(forms.ModelForm):
"""
Form to add an EnvironmentVariable to a Project.
This limits the name of the variable.
"""
project = forms.CharField(widget=forms.HiddenInput(), required=False)
class Meta(object):
model = EnvironmentVariable
fields = ('name', 'value', 'project')
def __init__(self, *args, **kwargs):
self.project = kwargs.pop('project', None)
super(EnvironmentVariableForm, self).__init__(*args, **kwargs)
def clean_project(self):
return self.project
def clean_name(self):
name = self.cleaned_data['name']
if name.startswith('__'):
raise forms.ValidationError(
_("Variable name can't start with __ (double underscore)"),
)
elif name.startswith('READTHEDOCS'):
raise forms.ValidationError(
_("Variable name can't start with READTHEDOCS"),
)
elif self.project.environmentvariable_set.filter(name=name).exists():
raise forms.ValidationError(
_('There is already a variable with this name for this project'),
)
elif ' ' in name:
raise forms.ValidationError(
_("Variable name can't contain spaces"),
)
elif not fullmatch('[a-zA-Z0-9_]+', name):
raise forms.ValidationError(
_('Only letters, numbers and underscore are allowed'),
)
return name
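The chain of checks in `clean_name` above can be read as a plain predicate. A minimal standalone sketch of the same rules (the `validate_name` helper is illustrative, not part of the PR; the per-project uniqueness check is omitted since it needs the database, and `re.fullmatch` is the Python 3 stdlib equivalent of the imported `fullmatch`):

```python
import re


def validate_name(name):
    """Apply the same name rules as EnvironmentVariableForm.clean_name (sketch)."""
    if name.startswith('__'):
        raise ValueError("Variable name can't start with __ (double underscore)")
    if name.startswith('READTHEDOCS'):
        raise ValueError("Variable name can't start with READTHEDOCS")
    if ' ' in name:
        raise ValueError("Variable name can't contain spaces")
    if not re.fullmatch('[a-zA-Z0-9_]+', name):
        raise ValueError('Only letters, numbers and underscore are allowed')
    return name
```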


@@ -0,0 +1,28 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.16 on 2018-12-17 17:32
from __future__ import unicode_literals
from django.db import migrations, models
def migrate_auto_doctype(apps, schema_editor):
Project = apps.get_model('projects', 'Project')
Project.objects.filter(documentation_type='auto').update(
documentation_type='sphinx',
)
class Migration(migrations.Migration):
dependencies = [
('projects', '0035_container_time_limit_as_integer'),
]
operations = [
migrations.RunPython(migrate_auto_doctype),
migrations.AlterField(
model_name='project',
name='documentation_type',
field=models.CharField(choices=[('sphinx', 'Sphinx Html'), ('mkdocs', 'Mkdocs (Markdown)'), ('sphinx_htmldir', 'Sphinx HtmlDir'), ('sphinx_singlehtml', 'Sphinx Single Page HTML')], default='sphinx', help_text='Type of documentation you are building. <a href="http://www.sphinx-doc.org/en/stable/builders.html#sphinx.builders.html.DirectoryHTMLBuilder">More info</a>.', max_length=20, verbose_name='Documentation type'),
),
]

(File diff suppressed because it is too large)


@@ -1,7 +1,11 @@
# -*- coding: utf-8 -*-
"""Project notifications"""
"""Project notifications."""
from __future__ import absolute_import
from datetime import timedelta
from django.utils import timezone
from django.http import HttpRequest
from messages_extends.models import Message
from readthedocs.notifications import Notification
from readthedocs.notifications.constants import REQUIREMENT
@@ -12,3 +16,40 @@ class ResourceUsageNotification(Notification):
context_object_name = 'project'
subject = 'Builds for {{ project.name }} are using too many resources'
level = REQUIREMENT
class DeprecatedViewNotification(Notification):
"""Notification to alert user of a view that is going away."""
context_object_name = 'project'
subject = '{{ project.name }} project webhook needs to be updated'
level = REQUIREMENT
@classmethod
def notify_project_users(cls, projects):
"""
Notify project users of deprecated view.
:param projects: List of project instances
:type projects: [:py:class:`Project`]
"""
for project in projects:
# Send one notification to each owner of the project
for user in project.users.all():
notification = cls(
context_object=project,
request=HttpRequest(),
user=user,
)
notification.send()
class DeprecatedGitHubWebhookNotification(DeprecatedViewNotification):
name = 'deprecated_github_webhook'
class DeprecatedBuildWebhookNotification(DeprecatedViewNotification):
name = 'deprecated_build_webhook'


@@ -63,6 +63,7 @@ class ProjectQuerySetBase(models.QuerySet):
The check consists of:
* the Project shouldn't be marked as skipped.
* none of the project's owners should be banned.
:param project: project to be checked
:type project: readthedocs.projects.models.Project
@@ -70,7 +71,8 @@ class ProjectQuerySetBase(models.QuerySet):
:returns: whether or not the project is active
:rtype: bool
"""
if project.skip:
any_owner_banned = any(u.profile.banned for u in project.users.all())
if project.skip or any_owner_banned:
return False
return True


@@ -1,16 +1,19 @@
# -*- coding: utf-8 -*-
"""Project signals"""
"""Project signals."""
from __future__ import absolute_import
import django.dispatch
before_vcs = django.dispatch.Signal(providing_args=['version'])
after_vcs = django.dispatch.Signal(providing_args=['version'])
before_vcs = django.dispatch.Signal(providing_args=["version"])
after_vcs = django.dispatch.Signal(providing_args=["version"])
before_build = django.dispatch.Signal(providing_args=['version'])
after_build = django.dispatch.Signal(providing_args=['version'])
before_build = django.dispatch.Signal(providing_args=["version"])
after_build = django.dispatch.Signal(providing_args=["version"])
project_import = django.dispatch.Signal(providing_args=['project'])
project_import = django.dispatch.Signal(providing_args=["project"])
files_changed = django.dispatch.Signal(providing_args=['project', 'files'])
files_changed = django.dispatch.Signal(providing_args=["project", "files"])
# Used to force verify a domain (eg. for SSL cert issuance)
domain_verify = django.dispatch.Signal(providing_args=["domain"])
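The `Signal` objects above are plain publish/subscribe hooks: senders call `send()`, and any connected receivers run. An illustrative pure-Python analogue of the dispatch pattern (this is not Django's actual implementation, just the shape of it):

```python
class Signal:
    """Minimal publish/subscribe dispatcher, analogous in spirit to django.dispatch.Signal."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # Return (receiver, response) pairs, as Django's Signal.send does.
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]


domain_verify = Signal()
domain_verify.connect(lambda sender, domain, **kw: 'verified %s' % domain)
```

A receiver connected to `domain_verify` would be the piece that actually performs the SSL-related verification mentioned in the comment.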


@@ -73,6 +73,7 @@ from .signals import (
before_build,
before_vcs,
files_changed,
domain_verify,
)
@@ -900,19 +901,19 @@ def sync_files(
# Clean up unused artifacts
version = Version.objects.get(pk=version_pk)
if not pdf:
remove_dir(
remove_dirs([
version.project.get_production_media_path(
type_='pdf',
version_slug=version.slug,
),
)
])
if not epub:
remove_dir(
remove_dirs([
version.project.get_production_media_path(
type_='epub',
version_slug=version.slug,
),
)
])
# Sync files to the web servers
move_files(
@@ -1371,27 +1372,18 @@ def update_static_metadata(project_pk, path=None):
# Random Tasks
@app.task()
def remove_dir(path):
def remove_dirs(paths):
"""
Remove a directory on the build/celery server.
Remove artifacts from servers.
This is mainly a wrapper around shutil.rmtree so that app servers can kill
things on the build server.
"""
log.info('Removing %s', path)
shutil.rmtree(path, ignore_errors=True)
This is mainly a wrapper around shutil.rmtree so that we can remove things across
every instance of a type of server (eg. all builds or all webs).
@app.task()
def clear_artifacts(paths):
"""
Remove artifacts from the web servers.
:param paths: list containing PATHs where production media is on disk
(usually ``Version.get_artifact_paths``)
:param paths: list containing PATHs where file is on disk
"""
for path in paths:
remove_dir(path)
log.info('Removing %s', path)
shutil.rmtree(path, ignore_errors=True)
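Stripped of the Celery decorator, the consolidated `remove_dirs` task is a small loop; a runnable sketch of the same body:

```python
import logging
import shutil

log = logging.getLogger(__name__)


def remove_dirs(paths):
    """Remove each path, ignoring missing directories (mirrors the task above)."""
    for path in paths:
        log.info('Removing %s', path)
        shutil.rmtree(path, ignore_errors=True)
```

Because the task now takes a list of paths, one `broadcast(type='app', task=remove_dirs, args=[paths])` call can clean the same artifacts on every instance of a server type, which is what let `clear_artifacts` be folded into it.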
@app.task(queue='web')
@@ -1450,3 +1442,17 @@ def finish_inactive_builds():
'Builds marked as "Terminated due inactivity": %s',
builds_finished,
)
@app.task(queue='web')
def retry_domain_verification(domain_pk):
"""
Trigger domain verification on a domain.
:param domain_pk: a `Domain` pk to verify
"""
domain = Domain.objects.get(pk=domain_pk)
domain_verify.send(
sender=domain.__class__,
domain=domain,
)


@@ -1,7 +1,12 @@
# -*- coding: utf-8 -*-
"""Project URLs for authenticated users."""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
from django.conf.urls import url
from readthedocs.constants import pattern_opts
@@ -12,6 +17,10 @@ from readthedocs.projects.views.private import (
DomainDelete,
DomainList,
DomainUpdate,
EnvironmentVariableCreate,
EnvironmentVariableDelete,
EnvironmentVariableList,
EnvironmentVariableDetail,
ImportView,
IntegrationCreate,
IntegrationDelete,
@@ -25,259 +34,177 @@ from readthedocs.projects.views.private import (
ProjectUpdate,
)
urlpatterns = [
url(
r'^$',
url(r'^$',
ProjectDashboard.as_view(),
name='projects_dashboard',
),
name='projects_dashboard'),
url(
r'^import/$',
url(r'^import/$',
ImportView.as_view(wizard_class=ImportWizardView),
{'wizard': ImportWizardView},
name='projects_import',
),
name='projects_import'),
url(
r'^import/manual/$',
url(r'^import/manual/$',
ImportWizardView.as_view(),
name='projects_import_manual',
),
name='projects_import_manual'),
url(
r'^import/manual/demo/$',
url(r'^import/manual/demo/$',
ImportDemoView.as_view(),
name='projects_import_demo',
),
name='projects_import_demo'),
url(
r'^(?P<project_slug>[-\w]+)/$',
url(r'^(?P<project_slug>[-\w]+)/$',
private.project_manage,
name='projects_manage',
),
name='projects_manage'),
url(
r'^(?P<project_slug>[-\w]+)/edit/$',
url(r'^(?P<project_slug>[-\w]+)/edit/$',
ProjectUpdate.as_view(),
name='projects_edit',
),
name='projects_edit'),
url(
r'^(?P<project_slug>[-\w]+)/advanced/$',
url(r'^(?P<project_slug>[-\w]+)/advanced/$',
ProjectAdvancedUpdate.as_view(),
name='projects_advanced',
),
name='projects_advanced'),
url(
r'^(?P<project_slug>[-\w]+)/version/(?P<version_slug>[^/]+)/delete_html/$',
url(r'^(?P<project_slug>[-\w]+)/version/(?P<version_slug>[^/]+)/delete_html/$',
private.project_version_delete_html,
name='project_version_delete_html',
),
name='project_version_delete_html'),
url(
r'^(?P<project_slug>[-\w]+)/version/(?P<version_slug>[^/]+)/$',
url(r'^(?P<project_slug>[-\w]+)/version/(?P<version_slug>[^/]+)/$',
private.project_version_detail,
name='project_version_detail',
),
name='project_version_detail'),
url(
r'^(?P<project_slug>[-\w]+)/versions/$',
url(r'^(?P<project_slug>[-\w]+)/versions/$',
private.project_versions,
name='projects_versions',
),
name='projects_versions'),
url(
r'^(?P<project_slug>[-\w]+)/delete/$',
url(r'^(?P<project_slug>[-\w]+)/delete/$',
private.project_delete,
name='projects_delete',
),
name='projects_delete'),
url(
r'^(?P<project_slug>[-\w]+)/users/$',
url(r'^(?P<project_slug>[-\w]+)/users/$',
private.project_users,
name='projects_users',
),
name='projects_users'),
url(
r'^(?P<project_slug>[-\w]+)/users/delete/$',
url(r'^(?P<project_slug>[-\w]+)/users/delete/$',
private.project_users_delete,
name='projects_users_delete',
),
name='projects_users_delete'),
url(
r'^(?P<project_slug>[-\w]+)/notifications/$',
url(r'^(?P<project_slug>[-\w]+)/notifications/$',
private.project_notifications,
name='projects_notifications',
),
name='projects_notifications'),
url(
r'^(?P<project_slug>[-\w]+)/notifications/delete/$',
url(r'^(?P<project_slug>[-\w]+)/notifications/delete/$',
private.project_notifications_delete,
name='projects_notification_delete',
),
name='projects_notification_delete'),
url(
r'^(?P<project_slug>[-\w]+)/translations/$',
url(r'^(?P<project_slug>[-\w]+)/translations/$',
private.project_translations,
name='projects_translations',
),
name='projects_translations'),
url(
r'^(?P<project_slug>[-\w]+)/translations/delete/(?P<child_slug>[-\w]+)/$', # noqa
url(r'^(?P<project_slug>[-\w]+)/translations/delete/(?P<child_slug>[-\w]+)/$', # noqa
private.project_translations_delete,
name='projects_translations_delete',
),
name='projects_translations_delete'),
url(
r'^(?P<project_slug>[-\w]+)/redirects/$',
url(r'^(?P<project_slug>[-\w]+)/redirects/$',
private.project_redirects,
name='projects_redirects',
),
name='projects_redirects'),
url(
r'^(?P<project_slug>[-\w]+)/redirects/delete/$',
url(r'^(?P<project_slug>[-\w]+)/redirects/delete/$',
private.project_redirects_delete,
name='projects_redirects_delete',
),
name='projects_redirects_delete'),
url(
r'^(?P<project_slug>[-\w]+)/advertising/$',
url(r'^(?P<project_slug>[-\w]+)/advertising/$',
ProjectAdvertisingUpdate.as_view(),
name='projects_advertising',
),
name='projects_advertising'),
]
domain_urls = [
url(
r'^(?P<project_slug>[-\w]+)/domains/$',
url(r'^(?P<project_slug>[-\w]+)/domains/$',
DomainList.as_view(),
name='projects_domains',
),
url(
r'^(?P<project_slug>[-\w]+)/domains/create/$',
name='projects_domains'),
url(r'^(?P<project_slug>[-\w]+)/domains/create/$',
DomainCreate.as_view(),
name='projects_domains_create',
),
url(
r'^(?P<project_slug>[-\w]+)/domains/(?P<domain_pk>[-\w]+)/edit/$',
name='projects_domains_create'),
url(r'^(?P<project_slug>[-\w]+)/domains/(?P<domain_pk>[-\w]+)/edit/$',
DomainUpdate.as_view(),
name='projects_domains_edit',
),
url(
r'^(?P<project_slug>[-\w]+)/domains/(?P<domain_pk>[-\w]+)/delete/$',
name='projects_domains_edit'),
url(r'^(?P<project_slug>[-\w]+)/domains/(?P<domain_pk>[-\w]+)/delete/$',
DomainDelete.as_view(),
name='projects_domains_delete',
),
name='projects_domains_delete'),
]
urlpatterns += domain_urls
integration_urls = [
url(
r'^(?P<project_slug>{project_slug})/integrations/$'.format(
**pattern_opts
),
url(r'^(?P<project_slug>{project_slug})/integrations/$'.format(**pattern_opts),
IntegrationList.as_view(),
name='projects_integrations',
),
url(
r'^(?P<project_slug>{project_slug})/integrations/sync/$'.format(
**pattern_opts
),
name='projects_integrations'),
url(r'^(?P<project_slug>{project_slug})/integrations/sync/$'.format(**pattern_opts),
IntegrationWebhookSync.as_view(),
name='projects_integrations_webhooks_sync',
),
url(
(
r'^(?P<project_slug>{project_slug})/integrations/create/$'.format(
**pattern_opts
)
),
name='projects_integrations_webhooks_sync'),
url((r'^(?P<project_slug>{project_slug})/integrations/create/$'
.format(**pattern_opts)),
IntegrationCreate.as_view(),
name='projects_integrations_create',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/$'.format(
**pattern_opts
)
),
name='projects_integrations_create'),
url((r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/$'
.format(**pattern_opts)),
IntegrationDetail.as_view(),
name='projects_integrations_detail',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/'
r'exchange/(?P<exchange_pk>[-\w]+)/$'.format(**pattern_opts)
),
name='projects_integrations_detail'),
url((r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/'
r'exchange/(?P<exchange_pk>[-\w]+)/$'
.format(**pattern_opts)),
IntegrationExchangeDetail.as_view(),
name='projects_integrations_exchanges_detail',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/sync/$'.format(
**pattern_opts
)
),
name='projects_integrations_exchanges_detail'),
url((r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/sync/$'
.format(**pattern_opts)),
IntegrationWebhookSync.as_view(),
name='projects_integrations_webhooks_sync',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/delete/$'.format(
**pattern_opts
)
),
name='projects_integrations_webhooks_sync'),
url((r'^(?P<project_slug>{project_slug})/'
r'integrations/(?P<integration_pk>{integer_pk})/delete/$'
.format(**pattern_opts)),
IntegrationDelete.as_view(),
name='projects_integrations_delete',
),
name='projects_integrations_delete'),
]
urlpatterns += integration_urls
subproject_urls = [
url(
r'^(?P<project_slug>{project_slug})/subprojects/$'.format(
**pattern_opts
),
url(r'^(?P<project_slug>{project_slug})/subprojects/$'.format(**pattern_opts),
private.ProjectRelationshipList.as_view(),
name='projects_subprojects',
),
url(
(
r'^(?P<project_slug>{project_slug})/subprojects/create/$'.format(
**pattern_opts
)
),
name='projects_subprojects'),
url((r'^(?P<project_slug>{project_slug})/subprojects/create/$'
.format(**pattern_opts)),
private.ProjectRelationshipCreate.as_view(),
name='projects_subprojects_create',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'subprojects/(?P<subproject_slug>{project_slug})/edit/$'.format(
**pattern_opts
)
),
name='projects_subprojects_create'),
url((r'^(?P<project_slug>{project_slug})/'
r'subprojects/(?P<subproject_slug>{project_slug})/edit/$'
.format(**pattern_opts)),
private.ProjectRelationshipUpdate.as_view(),
name='projects_subprojects_update',
),
url(
(
r'^(?P<project_slug>{project_slug})/'
r'subprojects/(?P<subproject_slug>{project_slug})/delete/$'.format(
**pattern_opts
)
),
name='projects_subprojects_update'),
url((r'^(?P<project_slug>{project_slug})/'
r'subprojects/(?P<subproject_slug>{project_slug})/delete/$'
.format(**pattern_opts)),
private.ProjectRelationshipDelete.as_view(),
name='projects_subprojects_delete',
),
name='projects_subprojects_delete'),
]
urlpatterns += subproject_urls
environmentvariable_urls = [
url(r'^(?P<project_slug>[-\w]+)/environmentvariables/$',
EnvironmentVariableList.as_view(),
name='projects_environmentvariables'),
url(r'^(?P<project_slug>[-\w]+)/environmentvariables/create/$',
EnvironmentVariableCreate.as_view(),
name='projects_environmentvariables_create'),
url(r'^(?P<project_slug>[-\w]+)/environmentvariables/(?P<environmentvariable_pk>[-\w]+)/$',
EnvironmentVariableDetail.as_view(),
name='projects_environmentvariables_detail'),
url(r'^(?P<project_slug>[-\w]+)/environmentvariables/(?P<environmentvariable_pk>[-\w]+)/delete/$',
EnvironmentVariableDelete.as_view(),
name='projects_environmentvariables_delete'),
]
urlpatterns += environmentvariable_urls


@@ -36,6 +36,7 @@ from readthedocs.projects import tasks
from readthedocs.projects.forms import (
DomainForm,
EmailHookForm,
EnvironmentVariableForm,
IntegrationForm,
ProjectAdvancedForm,
ProjectAdvertisingForm,
@@ -52,12 +53,14 @@ from readthedocs.projects.forms import (
from readthedocs.projects.models import (
Domain,
EmailHook,
EnvironmentVariable,
Project,
ProjectRelationship,
WebHook,
)
from readthedocs.projects.signals import project_import
from readthedocs.projects.views.base import ProjectAdminMixin, ProjectSpamMixin
from ..tasks import retry_domain_verification
log = logging.getLogger(__name__)
@@ -186,7 +189,7 @@ def project_version_detail(request, project_slug, version_slug):
log.info('Removing files for version %s', version.slug)
broadcast(
type='app',
task=tasks.clear_artifacts,
task=tasks.remove_dirs,
args=[version.get_artifact_paths()],
)
version.built = False
@@ -215,7 +218,11 @@ def project_delete(request, project_slug):
)
if request.method == 'POST':
broadcast(type='app', task=tasks.remove_dir, args=[project.doc_path])
broadcast(
type='app',
task=tasks.remove_dirs,
args=[(project.doc_path,)]
)
project.delete()
messages.success(request, _('Project deleted'))
project_dashboard = reverse('projects_dashboard')
@@ -695,7 +702,7 @@ def project_version_delete_html(request, project_slug, version_slug):
version.save()
broadcast(
type='app',
task=tasks.clear_artifacts,
task=tasks.remove_dirs,
args=[version.get_artifact_paths()],
)
else:
@@ -717,7 +724,14 @@ class DomainMixin(ProjectAdminMixin, PrivateViewMixin):
class DomainList(DomainMixin, ListViewWithForm):
pass
def get_context_data(self, **kwargs):
ctx = super(DomainList, self).get_context_data(**kwargs)
# Retry validation on all domains if applicable
for domain in ctx['domain_list']:
retry_domain_verification.delay(domain_pk=domain.pk)
return ctx
class DomainCreate(DomainMixin, CreateView):
@@ -866,3 +880,37 @@ class ProjectAdvertisingUpdate(PrivateViewMixin, UpdateView):
def get_success_url(self):
return reverse('projects_advertising', args=[self.object.slug])
class EnvironmentVariableMixin(ProjectAdminMixin, PrivateViewMixin):
"""Environment Variables to be added when building the Project."""
model = EnvironmentVariable
form_class = EnvironmentVariableForm
lookup_url_kwarg = 'environmentvariable_pk'
def get_success_url(self):
return reverse(
'projects_environmentvariables',
args=[self.get_project().slug],
)
class EnvironmentVariableList(EnvironmentVariableMixin, ListView):
pass
class EnvironmentVariableCreate(EnvironmentVariableMixin, CreateView):
pass
class EnvironmentVariableDetail(EnvironmentVariableMixin, DetailView):
pass
class EnvironmentVariableDelete(EnvironmentVariableMixin, DeleteView):
# This removes the delete confirmation
def get(self, request, *args, **kwargs):
return self.http_method_not_allowed(request, *args, **kwargs)


@@ -47,6 +47,12 @@ class ProjectAdminSerializer(ProjectSerializer):
slug_field='feature_id',
)
def get_environment_variables(self, obj):
return {
variable.name: variable.value
for variable in obj.environmentvariable_set.all()
}
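The serializer method above flattens the related queryset into a `{name: value}` mapping; the same shape with plain stand-in objects (the `namedtuple` here is a hypothetical substitute for the `EnvironmentVariable` model):

```python
from collections import namedtuple

# Stand-in for the EnvironmentVariable model, for illustration only.
EnvVar = namedtuple('EnvVar', 'name value')


def get_environment_variables(variables):
    """Flatten an iterable of variables into {name: value}, as the serializer does."""
    return {variable.name: variable.value for variable in variables}
```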
class Meta(ProjectSerializer.Meta):
fields = ProjectSerializer.Meta.fields + (
'enable_epub_build',


@@ -59,7 +59,7 @@ class ProjectAdminActionsTest(TestCase):
@mock.patch('readthedocs.projects.admin.broadcast')
def test_project_delete(self, broadcast):
"""Test project and artifacts are removed"""
from readthedocs.projects.tasks import remove_dir
from readthedocs.projects.tasks import remove_dirs
action_data = {
ACTION_CHECKBOX_NAME: [self.project.pk],
'action': 'delete_selected',
@@ -73,6 +73,6 @@ class ProjectAdminActionsTest(TestCase):
self.assertFalse(Project.objects.filter(pk=self.project.pk).exists())
broadcast.assert_has_calls([
mock.call(
type='app', task=remove_dir, args=[self.project.doc_path]
type='app', task=remove_dirs, args=[(self.project.doc_path,)]
),
])


@@ -107,6 +107,17 @@ class TestGitBackend(RTDTestCase):
self.assertEqual(code, 0)
self.assertTrue(exists(repo.working_dir))
def test_git_checkout_invalid_revision(self):
repo = self.project.vcs_repo()
repo.update()
version = 'invalid-revision'
with self.assertRaises(RepositoryError) as e:
repo.checkout(version)
self.assertEqual(
str(e.exception),
RepositoryError.FAILED_TO_CHECKOUT.format(version)
)
def test_git_tags(self):
repo_path = self.project.repo
create_git_tag(repo_path, 'v01')
@@ -256,6 +267,17 @@ class TestHgBackend(RTDTestCase):
self.assertEqual(code, 0)
self.assertTrue(exists(repo.working_dir))
def test_checkout_invalid_revision(self):
repo = self.project.vcs_repo()
repo.update()
version = 'invalid-revision'
with self.assertRaises(RepositoryError) as e:
repo.checkout(version)
self.assertEqual(
str(e.exception),
RepositoryError.FAILED_TO_CHECKOUT.format(version)
)
def test_parse_tags(self):
data = """\
tip 13575:8e94a1b4e9a4


@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
from __future__ import division, print_function, unicode_literals
import os
import shutil
from os.path import exists
@@ -6,21 +8,21 @@ from tempfile import mkdtemp
from django.contrib.auth.models import User
from django_dynamic_fixture import get
from mock import MagicMock, patch
from mock import patch, MagicMock
from readthedocs.builds.constants import LATEST
from readthedocs.builds.models import Build
from readthedocs.projects import tasks
from readthedocs.projects.exceptions import RepositoryError
from readthedocs.builds.models import Build
from readthedocs.projects.models import Project
from readthedocs.projects import tasks
from readthedocs.rtd_tests.utils import (
create_git_branch, create_git_tag, delete_git_branch)
from readthedocs.rtd_tests.utils import make_test_git
from readthedocs.rtd_tests.base import RTDTestCase
from readthedocs.rtd_tests.mocks.mock_api import mock_api
from readthedocs.rtd_tests.utils import (
create_git_branch,
create_git_tag,
delete_git_branch,
make_test_git,
)
from readthedocs.doc_builder.exceptions import VersionLockedError
class TestCeleryBuilding(RTDTestCase):
@@ -29,7 +31,7 @@ class TestCeleryBuilding(RTDTestCase):
def setUp(self):
repo = make_test_git()
self.repo = repo
super().setUp()
super(TestCeleryBuilding, self).setUp()
self.eric = User(username='eric')
self.eric.set_password('test')
self.eric.save()
@@ -43,12 +45,12 @@
def tearDown(self):
shutil.rmtree(self.repo)
super().tearDown()
super(TestCeleryBuilding, self).tearDown()
def test_remove_dir(self):
def test_remove_dirs(self):
directory = mkdtemp()
self.assertTrue(exists(directory))
result = tasks.remove_dir.delay(directory)
result = tasks.remove_dirs.delay((directory,))
self.assertTrue(result.successful())
self.assertFalse(exists(directory))
@@ -57,14 +59,14 @@
directory = self.project.get_production_media_path(type_='pdf', version_slug=version.slug)
os.makedirs(directory)
self.assertTrue(exists(directory))
result = tasks.clear_artifacts.delay(paths=version.get_artifact_paths())
result = tasks.remove_dirs.delay(paths=version.get_artifact_paths())
self.assertTrue(result.successful())
self.assertFalse(exists(directory))
directory = version.project.rtd_build_path(version=version.slug)
os.makedirs(directory)
self.assertTrue(exists(directory))
result = tasks.clear_artifacts.delay(paths=version.get_artifact_paths())
result = tasks.remove_dirs.delay(paths=version.get_artifact_paths())
self.assertTrue(result.successful())
self.assertFalse(exists(directory))
@@ -116,6 +118,25 @@
intersphinx=False)
self.assertTrue(result.successful())
@patch('readthedocs.projects.tasks.UpdateDocsTaskStep.setup_python_environment', new=MagicMock)
@patch('readthedocs.projects.tasks.UpdateDocsTaskStep.build_docs', new=MagicMock)
@patch('readthedocs.projects.tasks.UpdateDocsTaskStep.send_notifications')
@patch('readthedocs.projects.tasks.UpdateDocsTaskStep.setup_vcs')
def test_no_notification_on_version_locked_error(self, mock_setup_vcs, mock_send_notifications):
mock_setup_vcs.side_effect = VersionLockedError()
build = get(Build, project=self.project,
version=self.project.versions.first())
with mock_api(self.repo) as mapi:
result = tasks.update_docs_task.delay(
self.project.pk,
build_pk=build.pk,
record=False,
intersphinx=False)
mock_send_notifications.assert_not_called()
self.assertTrue(result.successful())
def test_sync_repository(self):
version = self.project.versions.get(slug=LATEST)
with mock_api(self.repo):


@@ -84,8 +84,6 @@ class LoadConfigTests(TestCase):
env_config={
'allow_v2': mock.ANY,
'build': {'image': 'readthedocs/build:1.0'},
'output_base': '',
'name': mock.ANY,
'defaults': {
'install_project': self.project.install_project,
'formats': [


@@ -15,6 +15,7 @@ from mock import patch
from readthedocs.builds.models import Version
from readthedocs.doc_builder.backends.mkdocs import MkdocsHTML
from readthedocs.doc_builder.backends.sphinx import BaseSphinx
from readthedocs.doc_builder.exceptions import MkDocsYAMLParseError
from readthedocs.doc_builder.python_environments import Virtualenv
from readthedocs.projects.exceptions import ProjectConfigurationError
from readthedocs.projects.models import Feature, Project
@@ -384,6 +385,39 @@ class MkdocsBuilderTest(TestCase):
'mkdocs'
)
@patch('readthedocs.doc_builder.base.BaseBuilder.run')
@patch('readthedocs.projects.models.Project.checkout_path')
def test_append_conf_existing_yaml_on_root_with_invalid_setting(self, checkout_path, run):
tmpdir = tempfile.mkdtemp()
os.mkdir(os.path.join(tmpdir, 'docs'))
yaml_file = os.path.join(tmpdir, 'mkdocs.yml')
checkout_path.return_value = tmpdir
python_env = Virtualenv(
version=self.version,
build_env=self.build_env,
config=None,
)
self.searchbuilder = MkdocsHTML(
build_env=self.build_env,
python_env=python_env,
)
# We can't use ``@pytest.mark.parametrize`` on a Django test case
yaml_contents = [
{'docs_dir': ['docs']},
{'extra_css': 'a string here'},
{'extra_javascript': None},
]
for content in yaml_contents:
yaml.safe_dump(
content,
open(yaml_file, 'w'),
)
with self.assertRaises(MkDocsYAMLParseError):
self.searchbuilder.append_conf()
@patch('readthedocs.doc_builder.base.BaseBuilder.run')
@patch('readthedocs.projects.models.Project.checkout_path')
def test_dont_override_theme(self, checkout_path, run):


@@ -1157,8 +1157,9 @@ class TestPythonEnvironment(TestCase):
]
self.pip_install_args = [
'python',
mock.ANY, # pip path
mock.ANY, # python path
'-m',
'pip',
'install',
'--upgrade',
'--cache-dir',
@@ -1247,8 +1248,9 @@
os.path.join(checkout_path, 'docs'): True,
}
args = [
'python',
mock.ANY, # pip path
mock.ANY, # python path
'-m',
'pip',
'install',
'--exists-action=w',
'--cache-dir',
@@ -1319,8 +1321,9 @@
]
args_pip = [
'python',
mock.ANY, # pip path
mock.ANY, # python path
'-m',
'pip',
'install',
'-U',
'--cache-dir',
@@ -1332,6 +1335,7 @@
'conda',
'install',
'--yes',
'--quiet',
'--name',
self.version_sphinx.slug,
]
@@ -1358,8 +1362,9 @@
]
args_pip = [
'python',
mock.ANY, # pip path
mock.ANY, # python path
'-m',
'pip',
'install',
'-U',
'--cache-dir',
@@ -1371,6 +1376,7 @@
'conda',
'install',
'--yes',
'--quiet',
'--name',
self.version_mkdocs.slug,
]


@@ -1,15 +1,24 @@
import django_dynamic_fixture as fixture
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals, division, print_function
import mock
from django.conf import settings
from mock import patch, mock_open
import django_dynamic_fixture as fixture
import pytest
import six
from django.contrib.auth.models import User
from django.http import Http404
from django.test import TestCase
from django.test.utils import override_settings
from django.http import Http404
from django.conf import settings
from django.urls import reverse
from readthedocs.core.views.serve import _serve_symlink_docs
from readthedocs.rtd_tests.base import RequestFactoryTestMixin
from readthedocs.projects import constants
from readthedocs.projects.models import Project
from readthedocs.rtd_tests.base import RequestFactoryTestMixin
from readthedocs.core.views.serve import _serve_symlink_docs
@override_settings(
USE_SUBDOMAIN=False, PUBLIC_DOMAIN='public.readthedocs.org', DEBUG=False
@@ -54,6 +63,16 @@ class TestPrivateDocs(BaseDocServing):
r._headers['x-accel-redirect'][1], '/private_web_root/private/en/latest/usage.html'
)
@override_settings(PYTHON_MEDIA=False)
def test_private_nginx_serving_unicode_filename(self):
with mock.patch('readthedocs.core.views.serve.os.path.exists', return_value=True):
request = self.request(self.private_url, user=self.eric)
r = _serve_symlink_docs(request, project=self.private, filename='/en/latest/úñíčódé.html', privacy_level='private')
self.assertEqual(r.status_code, 200)
self.assertEqual(
r._headers['x-accel-redirect'][1], '/private_web_root/private/en/latest/%C3%BA%C3%B1%C3%AD%C4%8D%C3%B3d%C3%A9.html'
)
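The expected `X-Accel-Redirect` value asserted above is just the UTF-8 percent-encoding of the unicode filename; `urllib.parse.quote` (which keeps `/` unescaped by default) reproduces it. The `encode_redirect_path` helper is illustrative, not the actual serve code:

```python
from urllib.parse import quote


def encode_redirect_path(root, filename):
    """Join a symlink root with a percent-encoded (UTF-8) doc filename."""
    return root + quote(filename)
```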
@override_settings(PYTHON_MEDIA=False)
def test_private_files_not_found(self):
request = self.request(self.private_url, user=self.eric)
@@ -62,6 +81,28 @@
self.assertTrue('private_web_root' in str(exc.exception))
self.assertTrue('public_web_root' not in str(exc.exception))
@override_settings(
PYTHON_MEDIA=False,
USE_SUBDOMAIN=True,
PUBLIC_DOMAIN='readthedocs.io',
ROOT_URLCONF=settings.SUBDOMAIN_URLCONF,
)
def test_robots_txt(self):
self.public.versions.update(active=True, built=True)
response = self.client.get(
reverse('robots_txt'),
HTTP_HOST='private.readthedocs.io',
)
self.assertEqual(response.status_code, 404)
self.client.force_login(self.eric)
response = self.client.get(
reverse('robots_txt'),
HTTP_HOST='private.readthedocs.io',
)
# Private projects/versions always return 404 for robots.txt
self.assertEqual(response.status_code, 404)
@override_settings(SERVE_DOCS=[constants.PRIVATE, constants.PUBLIC])
class TestPublicDocs(BaseDocServing):
@@ -95,3 +136,41 @@ class TestPublicDocs(BaseDocServing):
_serve_symlink_docs(request, project=self.private, filename='/en/latest/usage.html', privacy_level='public')
self.assertTrue('private_web_root' not in str(exc.exception))
self.assertTrue('public_web_root' in str(exc.exception))
@override_settings(
PYTHON_MEDIA=False,
USE_SUBDOMAIN=True,
PUBLIC_DOMAIN='readthedocs.io',
ROOT_URLCONF=settings.SUBDOMAIN_URLCONF,
)
def test_default_robots_txt(self):
self.public.versions.update(active=True, built=True)
response = self.client.get(
reverse('robots_txt'),
HTTP_HOST='public.readthedocs.io',
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content, b'User-agent: *\nAllow: /\n')
@override_settings(
PYTHON_MEDIA=False,
USE_SUBDOMAIN=True,
PUBLIC_DOMAIN='readthedocs.io',
ROOT_URLCONF=settings.SUBDOMAIN_URLCONF,
)
@patch(
'builtins.open',
new_callable=mock_open,
read_data='My own robots.txt',
)
@patch('readthedocs.core.views.serve.os')
@pytest.mark.skipif(six.PY2, reason='In Python2 the mock is __builtins__.open')
def test_custom_robots_txt(self, os_mock, open_mock):
os_mock.path.exists.return_value = True
self.public.versions.update(active=True, built=True)
response = self.client.get(
reverse('robots_txt'),
HTTP_HOST='public.readthedocs.io',
)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content, b'My own robots.txt')
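Taken together, these tests encode a small decision table: private hosts always return 404 for `robots.txt`, public hosts serve a custom file when one exists in the built docs, and otherwise fall back to a permissive default. A hedged sketch of that logic (`resolve_robots_txt` is illustrative, not the actual view):

```python
DEFAULT_ROBOTS_TXT = b'User-agent: *\nAllow: /\n'


def resolve_robots_txt(is_public, custom_content=None):
    """Return (status_code, body) following the behavior the tests assert."""
    if not is_public:
        return 404, None  # private projects/versions never expose robots.txt
    if custom_content is not None:
        return 200, custom_content  # user-provided robots.txt from the built docs
    return 200, DEFAULT_ROBOTS_TXT
```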


@@ -1,21 +1,27 @@
# -*- coding: utf-8 -*-
"""Notification tests"""
import django_dynamic_fixture as fixture
from __future__ import absolute_import
from datetime import timedelta
import mock
from django.contrib.auth.models import AnonymousUser, User
import django_dynamic_fixture as fixture
from django.http import HttpRequest
from django.test import TestCase
from django.test.utils import override_settings
from django.contrib.auth.models import User, AnonymousUser
from django.utils import timezone
from messages_extends.models import Message as PersistentMessage
from readthedocs.builds.models import Build
from readthedocs.notifications import Notification, SiteNotification
from readthedocs.notifications.backends import EmailBackend, SiteBackend
from readthedocs.notifications.constants import (
    ERROR,
    INFO_NON_PERSISTENT,
    WARNING_NON_PERSISTENT,
)
from readthedocs.notifications.constants import ERROR, INFO_NON_PERSISTENT, WARNING_NON_PERSISTENT
from readthedocs.projects.models import Project
from readthedocs.projects.notifications import (
DeprecatedGitHubWebhookNotification,
DeprecatedBuildWebhookNotification,
)
from readthedocs.builds.models import Build
@override_settings(
NOTIFICATION_BACKENDS=[
@@ -45,7 +51,7 @@ class NotificationTests(TestCase):
self.assertEqual(notify.get_template_names('site'),
['builds/notifications/foo_site.html'])
self.assertEqual(notify.get_subject(),
'This is {}'.format(build.id))
'This is {0}'.format(build.id))
self.assertEqual(notify.get_context_data(),
{'foo': build,
'production_uri': 'https://readthedocs.org',
@@ -84,7 +90,7 @@ class NotificationBackendTests(TestCase):
request=mock.ANY,
template='core/email/common.txt',
context={'content': 'Test'},
subject='This is {}'.format(build.id),
subject=u'This is {}'.format(build.id),
template_html='core/email/common.html',
recipient=user.email,
)
@@ -224,3 +230,43 @@ class SiteNotificationTests(TestCase):
with mock.patch('readthedocs.notifications.notification.log') as mock_log:
self.assertEqual(self.n.get_message(False), '')
mock_log.error.assert_called_once()
class DeprecatedWebhookEndpointNotificationTests(TestCase):
def setUp(self):
PersistentMessage.objects.all().delete()
self.user = fixture.get(User)
self.project = fixture.get(Project, users=[self.user])
self.request = HttpRequest()
self.notification = DeprecatedBuildWebhookNotification(
self.project,
self.request,
self.user,
)
@mock.patch('readthedocs.notifications.backends.send_email')
def test_deduplication(self, send_email):
user = fixture.get(User)
project = fixture.get(Project, main_language_project=None)
project.users.add(user)
project.refresh_from_db()
self.assertEqual(project.users.count(), 1)
self.assertEqual(PersistentMessage.objects.filter(user=user).count(), 0)
DeprecatedGitHubWebhookNotification.notify_project_users([project])
# Site and email notification will go out, site message doesn't have
# any reason to deduplicate yet
self.assertEqual(PersistentMessage.objects.filter(user=user).count(), 1)
self.assertTrue(send_email.called)
send_email.reset_mock()
self.assertFalse(send_email.called)
# Expect the site message to deduplicate, the email won't
DeprecatedGitHubWebhookNotification.notify_project_users([project])
self.assertEqual(PersistentMessage.objects.filter(user=user).count(), 1)
self.assertTrue(send_email.called)
send_email.reset_mock()
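The behaviour the test above asserts — persistent site messages deduplicate, emails do not — can be sketched independently of Django (all names here are hypothetical stand-ins, not the real notification backend API):

```python
# Hypothetical sketch: persistent site messages are stored once per
# (user, message) pair, while an email is dispatched on every call.
site_messages = set()
emails_sent = []

def notify(user, message):
    site_messages.add((user, message))   # set membership deduplicates
    emails_sent.append((user, message))  # emails always go out

notify('alice', 'deprecated webhook')
notify('alice', 'deprecated webhook')
print(len(site_messages), len(emails_sent))  # -> 1 2
```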


@@ -1,21 +1,25 @@
from __future__ import absolute_import
from __future__ import print_function
import re
import mock
from allauth.socialaccount.models import SocialAccount
from builtins import object
from django.contrib.admindocs.views import extract_views_from_urlpatterns
from django.test import TestCase
from django.urls import reverse
from django_dynamic_fixture import get
import mock
from taggit.models import Tag
from readthedocs.builds.models import Build, BuildCommandResult
from readthedocs.core.utils.tasks import TaskNoPermission
from readthedocs.integrations.models import HttpExchange, Integration
from readthedocs.oauth.models import RemoteOrganization, RemoteRepository
from readthedocs.projects.models import Domain, Project
from readthedocs.projects.models import Project, Domain, EnvironmentVariable
from readthedocs.oauth.models import RemoteRepository, RemoteOrganization
from readthedocs.rtd_tests.utils import create_user
class URLAccessMixin:
class URLAccessMixin(object):
default_kwargs = {}
response_data = {}
@@ -89,10 +93,10 @@ class URLAccessMixin:
for not_obj in self.context_data:
if isinstance(obj, list) or isinstance(obj, set) or isinstance(obj, tuple):
self.assertNotIn(not_obj, obj)
print("{} not in {}".format(not_obj, obj))
print("%s not in %s" % (not_obj, obj))
else:
self.assertNotEqual(not_obj, obj)
print("{} is not {}".format(not_obj, obj))
print("%s is not %s" % (not_obj, obj))
def _test_url(self, urlpatterns):
deconstructed_urls = extract_views_from_urlpatterns(urlpatterns)
@@ -130,7 +134,7 @@ class URLAccessMixin:
class ProjectMixin(URLAccessMixin):
def setUp(self):
super().setUp()
super(ProjectMixin, self).setUp()
self.build = get(Build, project=self.pip)
self.tag = get(Tag, slug='coolness')
self.subproject = get(Project, slug='sub', language='ja',
@@ -146,6 +150,7 @@ class ProjectMixin(URLAccessMixin):
status_code=200,
)
self.domain = get(Domain, url='http://docs.foobar.com', project=self.pip)
self.environment_variable = get(EnvironmentVariable, project=self.pip)
self.default_kwargs = {
'project_slug': self.pip.slug,
'subproject_slug': self.subproject.slug,
@@ -158,6 +163,7 @@ class ProjectMixin(URLAccessMixin):
'domain_pk': self.domain.pk,
'integration_pk': self.integration.pk,
'exchange_pk': self.webhook_exchange.pk,
'environmentvariable_pk': self.environment_variable.pk,
}
@@ -237,11 +243,13 @@ class PrivateProjectAdminAccessTest(PrivateProjectMixin, TestCase):
'/dashboard/pip/integrations/sync/': {'status_code': 405},
'/dashboard/pip/integrations/{integration_id}/sync/': {'status_code': 405},
'/dashboard/pip/integrations/{integration_id}/delete/': {'status_code': 405},
'/dashboard/pip/environmentvariables/{environmentvariable_id}/delete/': {'status_code': 405},
}
def get_url_path_ctx(self):
return {
'integration_id': self.integration.id,
'environmentvariable_id': self.environment_variable.id,
}
def login(self):
@@ -271,6 +279,7 @@ class PrivateProjectUserAccessTest(PrivateProjectMixin, TestCase):
'/dashboard/pip/integrations/sync/': {'status_code': 405},
'/dashboard/pip/integrations/{integration_id}/sync/': {'status_code': 405},
'/dashboard/pip/integrations/{integration_id}/delete/': {'status_code': 405},
'/dashboard/pip/environmentvariables/{environmentvariable_id}/delete/': {'status_code': 405},
}
# Filtered out by queryset on projects that we don't own.
@@ -279,6 +288,7 @@ class PrivateProjectUserAccessTest(PrivateProjectMixin, TestCase):
def get_url_path_ctx(self):
return {
'integration_id': self.integration.id,
'environmentvariable_id': self.environment_variable.id,
}
def login(self):
@@ -303,7 +313,7 @@ class PrivateProjectUnauthAccessTest(PrivateProjectMixin, TestCase):
class APIMixin(URLAccessMixin):
def setUp(self):
super().setUp()
super(APIMixin, self).setUp()
self.build = get(Build, project=self.pip)
self.build_command_result = get(BuildCommandResult, project=self.pip)
self.domain = get(Domain, url='http://docs.foobar.com', project=self.pip)


@@ -18,13 +18,14 @@ from readthedocs.projects.constants import (
)
from readthedocs.projects.exceptions import ProjectSpamError
from readthedocs.projects.forms import (
EnvironmentVariableForm,
ProjectAdvancedForm,
ProjectBasicsForm,
ProjectExtraForm,
TranslationForm,
UpdateProjectForm,
)
from readthedocs.projects.models import Project
from readthedocs.projects.models import Project, EnvironmentVariable
class TestProjectForms(TestCase):
@@ -503,3 +504,89 @@ class TestTranslationForms(TestCase):
instance=self.project_b_en
)
self.assertTrue(form.is_valid())
class TestProjectEnvironmentVariablesForm(TestCase):
def setUp(self):
self.project = get(Project)
def test_use_invalid_names(self):
data = {
'name': 'VARIABLE WITH SPACES',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
self.assertFalse(form.is_valid())
self.assertIn(
"Variable name can't contain spaces",
form.errors['name'],
)
data = {
'name': 'READTHEDOCS__INVALID',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
self.assertFalse(form.is_valid())
self.assertIn(
"Variable name can't start with READTHEDOCS",
form.errors['name'],
)
data = {
'name': 'INVALID_CHAR*',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
self.assertFalse(form.is_valid())
self.assertIn(
'Only letters, numbers and underscore are allowed',
form.errors['name'],
)
data = {
'name': '__INVALID',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
self.assertFalse(form.is_valid())
self.assertIn(
"Variable name can't start with __ (double underscore)",
form.errors['name'],
)
get(EnvironmentVariable, name='EXISTENT_VAR', project=self.project)
data = {
'name': 'EXISTENT_VAR',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
self.assertFalse(form.is_valid())
self.assertIn(
'There is already a variable with this name for this project',
form.errors['name'],
)
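Taken together, the assertions above imply a name validator roughly like the following. This is a sketch, not the actual form code; only the error strings are copied from the tests:

```python
import re

def validate_variable_name(name):
    """Return an error message for an invalid name, or None if valid."""
    if ' ' in name:
        return "Variable name can't contain spaces"
    if not re.fullmatch(r'\w+', name, re.ASCII):
        return 'Only letters, numbers and underscore are allowed'
    if name.startswith('__'):
        return "Variable name can't start with __ (double underscore)"
    if name.startswith('READTHEDOCS'):
        return "Variable name can't start with READTHEDOCS"
    return None

print(validate_variable_name('MYTOKEN'))              # -> None
print(validate_variable_name('READTHEDOCS__INVALID'))
# -> Variable name can't start with READTHEDOCS
```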
def test_create(self):
data = {
'name': 'MYTOKEN',
'value': 'string here',
}
form = EnvironmentVariableForm(data, project=self.project)
form.save()
self.assertEqual(EnvironmentVariable.objects.count(), 1)
self.assertEqual(EnvironmentVariable.objects.first().name, 'MYTOKEN')
self.assertEqual(EnvironmentVariable.objects.first().value, "'string here'")
data = {
'name': 'ESCAPED',
'value': r'string escaped here: #$\1[]{}\|',
}
form = EnvironmentVariableForm(data, project=self.project)
form.save()
self.assertEqual(EnvironmentVariable.objects.count(), 2)
self.assertEqual(EnvironmentVariable.objects.first().name, 'ESCAPED')
self.assertEqual(EnvironmentVariable.objects.first().value, r"'string escaped here: #$\1[]{}\|'")
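The expected values above (`"'string here'"` stored with surrounding single quotes) suggest the form shell-quotes each value before saving. `shlex.quote` produces exactly that shape; whether the form actually uses it is an assumption, shown here only to explain the quoting:

```python
import shlex

# shlex.quote wraps anything containing shell-special characters
# in single quotes, matching the stored values asserted above.
print(shlex.quote('string here'))
# -> 'string here'
print(shlex.quote(r'string escaped here: #$\1[]{}\|'))
# -> 'string escaped here: #$\1[]{}\|'
```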


@@ -1,13 +1,14 @@
# -*- coding: utf-8 -*-
from django.contrib.auth.models import User
from datetime import timedelta
import django_dynamic_fixture as fixture
from django.test import TestCase
from readthedocs.projects.models import Feature, Project, ProjectRelationship
from readthedocs.projects.querysets import (
ChildRelatedProjectQuerySet,
ParentRelatedProjectQuerySet,
)
from readthedocs.projects.models import Project, Feature
from readthedocs.projects.querysets import (ParentRelatedProjectQuerySet,
ChildRelatedProjectQuerySet)
class ProjectQuerySetTests(TestCase):
@@ -36,6 +37,12 @@ class ProjectQuerySetTests(TestCase):
project = fixture.get(Project, skip=True)
self.assertFalse(Project.objects.is_active(project))
user = fixture.get(User)
user.profile.banned = True
user.profile.save()
project = fixture.get(Project, skip=False, users=[user])
self.assertFalse(Project.objects.is_active(project))
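The new test encodes the rule that a project is inactive when it is marked "skip" or owned by a banned user. Stripped of the ORM, the check is roughly the following sketch (the classes are hypothetical plain-Python stand-ins for the Django models):

```python
class Profile:
    def __init__(self, banned=False):
        self.banned = banned

class User:
    def __init__(self, banned=False):
        self.profile = Profile(banned)

class Project:
    def __init__(self, skip=False, users=()):
        self.skip = skip
        self.users = list(users)

def is_active(project):
    # Inactive when flagged "skip" or when any owner is banned.
    if project.skip:
        return False
    if any(user.profile.banned for user in project.users):
        return False
    return True

print(is_active(Project(users=[User(banned=True)])))  # -> False
```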
class FeatureQuerySetTests(TestCase):


@@ -376,8 +376,8 @@ class TestPrivateViews(MockBuildTestCase):
self.assertFalse(Project.objects.filter(slug='pip').exists())
broadcast.assert_called_with(
type='app',
task=tasks.remove_dir,
args=[project.doc_path])
task=tasks.remove_dirs,
args=[(project.doc_path,)])
def test_subproject_create(self):
project = get(Project, slug='pip', users=[self.user])


@@ -16,24 +16,6 @@
{% block language-select-form %}{% endblock %}
{% block content %}
{% if suggestion %}
<div class="suggestions">
<h1>You've found something that doesn't exist.</h1>
<p>{{ suggestion.message }}</p>
{% ifequal suggestion.type 'top' %}
<p>
<a href="{{ suggestion.href }}">Go to the top of the documentation.</a>
</p>
{% endifequal %}
{% ifequal suggestion.type 'list' %}
<ul>
{% for item in suggestion.list %}
<li><a href="{% doc_url item.project item.version_slug item.pagename %}">{{ item.label }}</a></li>
{% endfor %}
</ul>
{% endifequal %}
</div>
{% endif %}
<pre style="line-height: 1.25; white-space: pre;">
\ SORRY /


@@ -14,24 +14,6 @@
</h1>
</div>
<!-- END header title -->
<!-- BEGIN header nav -->
<div class="rtfd-header-nav">
<ul>
{% if request.user.is_authenticated %}
<li>
<a href="//{{ PRODUCTION_DOMAIN }}/accounts/logout/">{% trans "Log Out" %}</a>
</li>
{% else %}
<li>
<a href="//{{ PRODUCTION_DOMAIN }}/accounts/login/">{% trans "Log in" %}</a>
</li>
{% endif %}
</ul>
</div>
<!-- END header nav -->
</div>
</div>
<!-- END header-->


@@ -0,0 +1,30 @@
{% extends "projects/project_edit_base.html" %}
{% load i18n %}
{% block title %}{% trans "Environment Variables" %}{% endblock %}
{% block nav-dashboard %} class="active"{% endblock %}
{% block editing-option-edit-environment-variables %}class="active"{% endblock %}
{% block project-environment-variables-active %}active{% endblock %}
{% block project_edit_content_header %}
{% blocktrans trimmed with name=environmentvariable.name %}
Environment Variable: {{ name }}
{% endblocktrans %}
{% endblock %}
{% block project_edit_content %}
<p>
{% blocktrans trimmed %}
The value of the environment variable is not shown here for security purposes.
{% endblocktrans %}
</p>
<form method="post" action="{% url 'projects_environmentvariables_delete' project_slug=project.slug environmentvariable_pk=environmentvariable.pk %}">
{% csrf_token %}
<input type="submit" value="{% trans "Delete" %}">
</form>
{% endblock %}


@@ -0,0 +1,22 @@
{% extends "projects/project_edit_base.html" %}
{% load i18n %}
{% block title %}{% trans "Environment Variables" %}{% endblock %}
{% block nav-dashboard %} class="active"{% endblock %}
{% block editing-option-edit-environment-variables %}class="active"{% endblock %}
{% block project-environment-variables-active %}active{% endblock %}
{% block project_edit_content_header %}{% trans "Environment Variables" %}{% endblock %}
{% block project_edit_content %}
<form
method="post"
action="{% url 'projects_environmentvariables_create' project_slug=project.slug %}">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" value="{% trans "Save" %}">
</form>
{% endblock %}


@@ -0,0 +1,47 @@
{% extends "projects/project_edit_base.html" %}
{% load i18n %}
{% block title %}{% trans "Environment Variables" %}{% endblock %}
{% block nav-dashboard %} class="active"{% endblock %}
{% block editing-option-edit-environment-variables %}class="active"{% endblock %}
{% block project-environment-variables-active %}active{% endblock %}
{% block project_edit_content_header %}{% trans "Environment Variables" %}{% endblock %}
{% block project_edit_content %}
<p>Environment variables allow you to change the way that your build behaves. Note that these environment variables are available to all build steps.</p>
<div class="button-bar">
<ul>
<li>
<a class="button"
href="{% url 'projects_environmentvariables_create' project_slug=project.slug %}">
{% trans "Add Environment Variable" %}
</a>
</li>
</ul>
</div>
<div class="module-list">
<div class="module-list-wrapper">
<ul>
{% for environmentvariable in object_list %}
<li class="module-item">
<a href="{% url 'projects_environmentvariables_detail' project_slug=project.slug environmentvariable_pk=environmentvariable.pk %}">
{{ environmentvariable.name }}
</a>
</li>
{% empty %}
<li class="module-item">
<p class="quiet">
{% trans 'No environment variables are currently configured.' %}
</p>
</li>
{% endfor %}
</ul>
</div>
</div>
{% endblock %}


@@ -0,0 +1,6 @@
<p>Your project, {{ project.name }}, is currently using a legacy incoming webhook to trigger builds on Read the Docs. Effective April 1st, 2019, Read the Docs will no longer accept incoming webhooks through these endpoints.</p>
<p>To continue building your Read the Docs project on changes to your repository, you will need to configure a new webhook with your VCS provider. You can find more information on how to configure a new webhook in our documentation, at:</p>
{% comment %}Plain text link because of text version of email{% endcomment %}
<p><a href="https://docs.readthedocs.io/en/latest/webhooks.html#webhook-deprecated-endpoints">https://docs.readthedocs.io/en/latest/webhooks.html#webhook-deprecated-endpoints</a></p>


@@ -0,0 +1 @@
Your project, {{ project.name }}, needs to be reconfigured in order to continue building automatically after April 1st, 2019. For more information, <a href="https://docs.readthedocs.io/en/latest/webhooks.html#webhook-deprecated-endpoints">see our documentation on webhook integrations</a>.


@@ -0,0 +1,8 @@
<p>Your project, {{ project.name }}, is currently using GitHub Services to trigger builds on Read the Docs. Effective January 31, 2019, GitHub will no longer process requests using the Services feature, and so Read the Docs will not receive notifications on updates to your repository.</p>
<p>To continue building your Read the Docs project on changes to your repository, you will need to add a new webhook on your GitHub repository. You can either connect your GitHub account and configure a GitHub webhook integration, or you can add a generic webhook integration.</p>
<p>You can find more information on our webhook integrations in our documentation, at:</p>
{% comment %}Plain text link because of text version of email{% endcomment %}
<p><a href="https://docs.readthedocs.io/en/latest/webhooks.html#webhook-github-services">https://docs.readthedocs.io/en/latest/webhooks.html#webhook-github-services</a></p>


@@ -0,0 +1 @@
Your project, {{ project.name }}, needs to be reconfigured in order to continue building automatically after January 31st, 2019. For more information, <a href="https://docs.readthedocs.io/en/latest/webhooks.html#webhook-github-services">see our documentation on webhook integrations</a>.


@@ -23,6 +23,7 @@
<li class="{% block project-translations-active %}{% endblock %}"><a href="{% url "projects_translations" project.slug %}">{% trans "Translations" %}</a></li>
<li class="{% block project-subprojects-active %}{% endblock %}"><a href="{% url "projects_subprojects" project.slug %}">{% trans "Subprojects" %}</a></li>
<li class="{% block project-integrations-active %}{% endblock %}"><a href="{% url "projects_integrations" project.slug %}">{% trans "Integrations" %}</a></li>
<li class="{% block project-environment-variables-active %}{% endblock %}"><a href="{% url "projects_environmentvariables" project.slug %}">{% trans "Environment Variables" %}</a></li>
<li class="{% block project-notifications-active %}{% endblock %}"><a href="{% url "projects_notifications" project.slug %}">{% trans "Notifications" %}</a></li>
{% if USE_PROMOS %}
<li class="{% block project-ads-active %}{% endblock %}"><a href="{% url "projects_advertising" project.slug %}">{% trans "Advertising" %} </a></li>


@@ -85,4 +85,9 @@ class Backend(BaseVCS):
super().checkout()
if not identifier:
return self.up()
return self.run('bzr', 'switch', identifier)
exit_code, stdout, stderr = self.run('bzr', 'switch', identifier)
if exit_code != 0:
raise RepositoryError(
RepositoryError.FAILED_TO_CHECKOUT.format(identifier)
)
return exit_code, stdout, stderr


@@ -159,7 +159,9 @@ class Backend(BaseVCS):
code, out, err = self.run('git', 'checkout', '--force', revision)
if code != 0:
log.warning("Failed to checkout revision '%s': %s", revision, code)
raise RepositoryError(
RepositoryError.FAILED_TO_CHECKOUT.format(revision)
)
return [code, out, err]
def clone(self):
@@ -213,8 +215,10 @@
@property
def commit(self):
_, stdout, _ = self.run('git', 'rev-parse', 'HEAD')
return stdout.strip()
if self.repo_exists():
_, stdout, _ = self.run('git', 'rev-parse', 'HEAD')
return stdout.strip()
return None
def checkout(self, identifier=None):
"""Checkout to identifier or latest."""


@@ -108,4 +108,11 @@ class Backend(BaseVCS):
super().checkout()
if not identifier:
identifier = 'tip'
return self.run('hg', 'update', '--clean', identifier)
exit_code, stdout, stderr = self.run(
'hg', 'update', '--clean', identifier
)
if exit_code != 0:
raise RepositoryError(
RepositoryError.FAILED_TO_CHECKOUT.format(identifier)
)
return exit_code, stdout, stderr
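The bzr, git, and hg backends above now share one pattern: run the checkout command, raise `RepositoryError` on a nonzero exit code, otherwise return the `(exit_code, stdout, stderr)` tuple. A backend-agnostic sketch of that pattern (the `RepositoryError` class and its `FAILED_TO_CHECKOUT` message text are stand-ins for the real ones in the Read the Docs codebase):

```python
class RepositoryError(Exception):
    # Assumed message text; stand-in for the project's real exception.
    FAILED_TO_CHECKOUT = 'Failed to checkout revision: {}'

def checkout_or_raise(run, cmd, identifier):
    """Run a VCS checkout command, raising on failure."""
    exit_code, stdout, stderr = run(*cmd)
    if exit_code != 0:
        raise RepositoryError(
            RepositoryError.FAILED_TO_CHECKOUT.format(identifier)
        )
    return exit_code, stdout, stderr

# A failing command surfaces as RepositoryError:
try:
    checkout_or_raise(lambda *a: (1, '', 'err'), ('git', 'checkout', 'v1'), 'v1')
except RepositoryError as exc:
    print(exc)  # -> Failed to checkout revision: v1
```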


@@ -0,0 +1,24 @@
-r pip.txt
# Base packages
docutils==0.14
Sphinx==1.8.3
sphinx_rtd_theme==0.4.2
sphinx-tabs==1.1.10
# Required to avoid Transifex error with reserved slug
# https://github.com/sphinx-doc/sphinx-intl/pull/27
git+https://github.com/agjohnson/sphinx-intl.git@7b5c66bdb30f872b3b1286e371f569c8dcb66de5#egg=sphinx-intl
Pygments==2.3.1
mkdocs==1.0.4
Markdown==3.0.1
# Docs
sphinxcontrib-httpdomain==1.7.0
sphinx-prompt==1.0.0
# commonmark 0.5.5 is the latest version compatible with our docs;
# newer ones make `tox -e docs` fail
commonmark==0.5.5
recommonmark==0.4.0


@ -1,7 +0,0 @@
-r pip.txt
gunicorn
#For resizing images
pillow
python-memcached
whoosh
django-redis


@@ -2,18 +2,6 @@
pip==18.1
appdirs==1.4.3
virtualenv==16.2.0
docutils==0.14
Sphinx==1.8.3
sphinx_rtd_theme==0.4.2
sphinx-tabs==1.1.10
# Required to avoid Transifex error with reserved slug
# https://github.com/sphinx-doc/sphinx-intl/pull/27
git+https://github.com/agjohnson/sphinx-intl.git@7b5c66bdb30f872b3b1286e371f569c8dcb66de5#egg=sphinx-intl
Pygments==2.3.1
mkdocs==1.0.4
Markdown==3.0.1
django==1.11.18
django-tastypie==0.14.2
@@ -39,6 +27,8 @@ requests-toolbelt==0.8.0
slumber==0.7.1
lxml==4.2.5
defusedxml==0.5.0
pyyaml==3.13
Pygments==2.3.1
# Basic tools
# Redis 3.x has an incompatible change and fails
@@ -94,15 +84,6 @@ djangorestframework-jsonp==1.0.2
django-taggit==0.23.0
dj-pagination==2.4.0
# Docs
sphinxcontrib-httpdomain==1.7.0
# commonmark 0.5.5 is the latest version compatible with our docs, the
# newer ones make `tox -e docs` to fail
commonmark==0.5.5
recommonmark==0.4.0
# Version comparison stuff
packaging==18.0


@@ -1,4 +1,5 @@
-r pip.txt
-r local-docs-build.txt
django-dynamic-fixture==2.0.0
pytest==4.0.2


@@ -1,6 +1,6 @@
[metadata]
name = readthedocs
version = 2.8.4
version = 2.8.5
license = MIT
description = Read the Docs builds and hosts documentation
author = Read the Docs, Inc


@@ -1,6 +1,4 @@
"""
Read the Docs tasks
"""
"""Read the Docs tasks."""
from __future__ import division, print_function, unicode_literals
@@ -14,15 +12,28 @@ import common.tasks
ROOT_PATH = os.path.dirname(__file__)
# TODO make these tasks namespaced
# release = Collection(common.tasks.prepare, common.tasks.release)
namespace = Collection(
common.tasks.prepare,
common.tasks.release,
#release=release,
namespace = Collection()
namespace.add_collection(
Collection(
common.tasks.prepare,
common.tasks.release,
),
name='deploy',
)
namespace.add_collection(
Collection(
common.tasks.setup_labels,
),
name='github',
)
namespace.add_collection(
Collection(
common.tasks.upgrade_all_packages,
),
name='packages',
)
# Localization tasks
@task