Move Zulip backend tests to zerver.tests.

Tim Abbott 2016-04-11 22:16:09 -07:00
parent 1bf644369f
commit be96cf809d
21 changed files with 16 additions and 16 deletions

.gitattributes

@@ -14,7 +14,7 @@
/zproject/local_settings.py export-ignore
/zproject/test_settings.py export-ignore
/zerver/fixtures export-ignore
-/zerver/tests.py export-ignore
+/zerver/tests export-ignore
/frontend_tests export-ignore
/node_modules export-ignore
/humbug export-ignore


@@ -465,7 +465,7 @@ time debugging a test failure, e.g.:
```
./tools/lint-all # Runs all the linters in parallel
-./tools/test-backend zerver.test_bugdown.BugdownTest.test_inline_youtube
+./tools/test-backend zerver.tests.test_bugdown.BugdownTest.test_inline_youtube
./tools/test-js-with-casper 10-navigation.js
./tools/test-js-with-node # Runs all node tests but is very fast
```


@@ -76,7 +76,7 @@ Tests
=====
+------------------------+-----------------------------------+
-| ``zerver/test*.py`` | Backend tests |
+| ``zerver/tests/`` | Backend tests |
+------------------------+-----------------------------------+
| ``frontend_tests/node`` | Node Frontend unit tests |
+------------------------+-----------------------------------+


@@ -71,14 +71,14 @@ Here's how we recommend doing it:
usually just have more complex parsing which can obscure what's
common to all webhook integrations.
-* Then write a test for your fixture in `zerver/test_hooks.py`, and
+* Then write a test for your fixture in `zerver/tests/test_hooks.py`, and
you can iterate on the tests and webhooks handler until they work,
all without ever needing to post directly from the server you're
integrating to your Zulip development machine. To run just the
tests from the test class you wrote, you can use e.g.
```
-test-backend zerver.test_hooks.PagerDutyHookTests
+test-backend zerver.tests.test_hooks.PagerDutyHookTests
```
See
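
For context, the test this hunk asks you to write is an ordinary Django test case that posts a captured fixture payload to the webhook endpoint. Below is a minimal sketch under assumed names (the endpoint URL, fixture filename, and class are illustrative; the real tests in `zerver/tests/test_hooks.py` use the project's own helpers):

```
import os
from django.test import TestCase

class ExampleHookTests(TestCase):
    # Directory holding captured payloads; the relative path mirrors the
    # fixture layout referenced elsewhere in this commit (an assumption here).
    FIXTURE_DIR = os.path.join(os.path.dirname(__file__), "../fixtures")

    def test_example_payload(self):
        # Load a captured webhook payload from disk.
        with open(os.path.join(self.FIXTURE_DIR, "example_payload.json")) as f:
            payload = f.read()
        # Post it to a hypothetical webhook endpoint and check that it is
        # accepted; real webhook tests also verify the resulting message.
        result = self.client.post(
            "/api/v1/external/example?api_key=TEST_KEY",
            payload,
            content_type="application/json",
        )
        self.assertEqual(result.status_code, 200)
```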


@@ -11,7 +11,7 @@ javascript, based on marked (`static/js/echo.js`), and is used to
preview and locally echo messages the moment the sender hits enter,
without waiting for round trip from the server. The two
implementations are tested for compatibility via
-`zerver/test_bugdown.py` and the fixtures under
+`zerver/tests/test_bugdown.py` and the fixtures under
`zerver/fixtures/bugdown-data.json`.
The javascript implementation knows which types of messages it can


@@ -62,8 +62,8 @@ process.
**Testing:** There are two types of frontend tests: node-based unit tests and
blackbox end-to-end tests. The blackbox tests are run in a headless browser
-using Casper.js and are located in ``zerver/tests/frontend/tests/``. The unit
-tests use Node's ``assert`` module and are located in ``zerver/tests/frontend/node/``.
+using Casper.js and are located in ``frontend_tests/casper_tests/``. The unit
+tests use Node's ``assert`` module and are located in ``frontend_tests/node_tests/``.
For more information on writing and running tests see the :doc:`testing
documentation <testing>`.


@@ -53,8 +53,8 @@ it. On Ubuntu:
Backend Django tests
--------------------
-These live in ``zerver/tests.py`` and ``zerver/test_*.py``. Run them
-with ``tools/test-backend``.
+These live in ``zerver/tests/tests.py`` and
+``zerver/tests/test_*.py``. Run them with ``tools/test-backend``.
Web frontend black-box casperjs tests
-------------------------------------


@@ -30,7 +30,7 @@ if __name__ == "__main__":
    (options, args) = parser.parse_args()
    if len(args) == 0:
-        suites = ["zerver"]
+        suites = ["zerver.tests"]
    else:
        suites = args
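
For reference, a rough sketch of how a default label list like `["zerver.tests"]` is typically handed to Django's test runner; this is an assumption about the surrounding script, not the actual `tools/test-backend` implementation, which may use a different runner:

```
import os
import django
from django.test.runner import DiscoverRunner

def run_backend_suites(suites):
    # Settings module name taken from zproject/test_settings.py above.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "zproject.test_settings")
    django.setup()
    runner = DiscoverRunner(verbosity=1)
    # Each label may be a package, module, class, or single test, e.g.
    # "zerver.tests.test_bugdown.BugdownTest.test_inline_youtube".
    return runner.run_tests(suites)
```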


@@ -573,13 +573,13 @@ def profiled(func):
"""
This decorator should obviously be used only in a dev environment.
It works best when surrounding a function that you expect to be
-called once. One strategy is to write a test case in zerver/tests.py
-and wrap the test case with the profiled decorator.
+called once. One strategy is to write a backend test and wrap the
+test case with the profiled decorator.
You can run a single test case like this:
-# edit zerver/tests.py and place @profiled above the test case below
-./tools/test-backend zerver.RateLimitTests.test_ratelimit_decrease
+# edit zerver/tests/test_external.py and place @profiled above the test case below
+./tools/test-backend zerver.tests.test_external.RateLimitTests.test_ratelimit_decrease
Then view the results like this:
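
As a usage sketch (not part of this commit), wrapping a test with the decorator looks roughly like this; the import path and the test body are assumptions, and only the test names come from the docstring above:

```
# Illustrative only: applying the @profiled decorator to a single test
# method so that running just that test produces a profile dump.
from django.test import TestCase
from zerver.decorator import profiled  # assumed location of the decorator

class RateLimitTests(TestCase):
    @profiled
    def test_ratelimit_decrease(self):
        pass  # the code you want profiled while the test runs
```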


@@ -138,7 +138,7 @@ class BugdownTest(TestCase):
    def load_bugdown_tests(self):
        test_fixtures = {}
-        data_file = open(os.path.join(os.path.dirname(__file__), 'fixtures/bugdown-data.json'), 'r')
+        data_file = open(os.path.join(os.path.dirname(__file__), '../fixtures/bugdown-data.json'), 'r')
        data = ujson.loads('\n'.join(data_file.readlines()))
        for test in data['regular_tests']:
            test_fixtures[test['name']] = test
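
As a side note, the changed line above only adjusts the fixture path; the same loading step can also be written with a context manager so the file handle is closed. A behavior-preserving sketch, using the `../fixtures` path and JSON keys from the diff (the helper name is made up):

```
import os
import ujson

def load_bugdown_fixtures():
    # Resolve the fixture file relative to zerver/tests/, one level below
    # zerver/fixtures/, matching the path change in the hunk above.
    path = os.path.join(os.path.dirname(__file__), '../fixtures/bugdown-data.json')
    with open(path, 'r') as data_file:
        data = ujson.loads(data_file.read())
    return {test['name']: test for test in data['regular_tests']}
```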