Emojis that are represented by a sequence of codepoints, or emojis
with ZWJ, are not included until we implement a mechanism for dealing
with their Unicode versions.
Fixes: #6279.
While running the mypy script, we were not passing the `--force`
argument correctly to run-mypy, which was causing it to complain
about provision status.
Tweaked by tabbott to not remove it from lister.py, linter_lib, and
friends, since those are intended to support both Python 2 and 3
(we're planning to extract them from the repository).
This hack was used to fix the broken number emojis in the emoji picker.
They were broken because of the partial migration to the iamcal dataset.
See issue #4775 for more details.
This hack was used to fix the broken flag emojis in the emoji picker.
They were broken due to the incomplete migration to the iamcal dataset.
See issue #4775 for more details.
This commit switches to using sprite sheets for rendering emojis
in all the remaining places, i.e., message bodies and composebox
typeahead. It also includes some changes to notifications.py
so that the spans used for rendering emojis can be converted
to corresponding image tags; this keeps emoji rendering from breaking
in missed-message emails, where we can't use sprite sheets.
As part of switching the bugdown system to use sprite sheets, we need
to switch the name_to_codepoint mappings to match the new sprite
sheets. This has the side effect of fixing a bunch of emoji, like
number and flag emoji, in the emoji pickers.
Fixes: #3895.
Fixes: #3972.
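For illustration, the span-to-image conversion for emails could look roughly like the sketch below; the span markup, class names, and image path here are assumptions, not the exact format notifications.py handles.
```python
import re

def fix_emojis_for_email(html: str, base_url: str) -> str:
    # Hypothetical sketch: rewrite sprite-sheet emoji spans into <img> tags,
    # since missed-message emails can't rely on the sprite-sheet CSS.
    def replace(match: "re.Match[str]") -> str:
        codepoint, name = match.group(1), match.group(2)
        src = f"{base_url}/static/generated/emoji/images-64/{codepoint}.png"
        return f'<img alt=":{name}:" src="{src}" title=":{name}:" style="height: 20px;">'

    pattern = r'<span class="emoji emoji-([0-9a-f-]+)" title="([^"]+)">[^<]*</span>'
    return re.sub(pattern, replace, html)
```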
These are long enough to still be self-explanatory (the only one I'm
at all in doubt about there is DEBG; I avoided "DBUG" because it reads
"BUG" which suggests a high-priority message, and those are the
opposite of that), while saving a good bit of horizontal space
vs. padding everything to the 8 characters of "CRITICAL".
Also add a linter exception to allow easy-to-read alignment here,
similar to several existing exceptions for other alignment cases.
This doesn't yet do much, but it gives us a suitable place to
add code to customize how log messages are displayed, beyond what
a format string passed to the default formatter can do.
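A minimal sketch of the kind of formatter hook this adds; the class name and format string below are placeholders, not the real implementation.
```python
import logging

class ZulipFormatter(logging.Formatter):
    # Placeholder subclass: behaves like the stock formatter for now, but
    # gives us one place to customize how log lines are rendered later
    # (for example, the abbreviated level names mentioned above).
    def __init__(self) -> None:
        super().__init__(fmt="%(asctime)s %(levelname)-4s %(message)s")

    def format(self, record: logging.LogRecord) -> str:
        # Future customizations would adjust `record` here before delegating.
        return super().format(record)
```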
Printing the version in Travis builds will help in debugging when we
get different results there from locally. The new `--version` path
also gives us a handy place to put the "what mypy command are we running"
diagnostic, getting it out of the way of normal interactive use.
Most CLI tools (including GNU tools and Mercurial) use lowercase
sentence fragments, with no period, for option glosses, so we
follow that style. Also make the voice and wording consistent,
as well as the quote style in the Python source.
This exclusion was getting snuck in at the end, bypassing --all (and so
giving the lie to the --help documentation). There is no "stubs"
directory in the tree in any case.
argparse has reasonable default behavior for the `dest` argument,
and for `default` at least in these two obvious cases.
So cut those out where we're just doing the default thing.
Also rewrap a couple of calls to fit at least in 100 columns.
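As an illustration of the redundancy being removed (the option name here is made up):
```python
import argparse

parser = argparse.ArgumentParser()

# Redundant form: argparse already derives dest="modified" from the long
# option, and action='store_true' already defaults to False.
#
#   parser.add_argument('--modified', '-m', action='store_true',
#                       dest='modified', default=False,
#                       help='only check modified files')
#
# Equivalent, relying on the defaults:
parser.add_argument('--modified', '-m', action='store_true',
                    help='only check modified files')
```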
The pattern test method `test_rule_patterns` tests each rule by
fetching two strings from it: `test_good` and `test_bad`. Each
string is then presented as an input file to `custom_check_file`,
which should return True or False.
All lines in a string need to end with `\n`. Since the linter
expects an additional newline at the end of a file, the test case
adds `\n` to each string on top of that.
Fixes #6320.
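Roughly, the test works like the following sketch; the rule format and the `custom_check_file` signature are simplified for illustration.
```python
from typing import Any, Callable, Dict, List

def check_rule_patterns(all_rules: List[Dict[str, Any]],
                        custom_check_file: Callable[..., bool]) -> None:
    # Simplified sketch: feed each rule's example strings to the checker and
    # confirm that good examples pass (False) and bad examples fail (True).
    for rule in all_rules:
        for key, should_fail in (('test_good', False), ('test_bad', True)):
            for content in rule.get(key, []):
                # Each string already ends with '\n'; add one more, since the
                # linter expects a trailing newline at the end of a file.
                fake_file_contents = content + '\n'
                failed = custom_check_file(fake_file_contents, 'test.py', [rule])
                assert failed == should_fail
```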
This function was used to get a black-and-white glyph for an emoji if there
was no corresponding image file present in `NotoColorEmoji.ttf`, but
due to the new emoji farm setup code, we no longer need this.
This function was used to color transparent number emojis. We no
longer need to do this, since we have now remapped number emojis
to the corresponding colored emojis in the new emoji farm.
We have symlinked the old emoji farm to the new emoji farm, and hence
we don't use the images from the `NotoColorEmoji.ttf` file. This
function was used to generate a map from codepoint to filename by
parsing the ttx file passed to it. We no longer need this map.
This commit extracts the `generate_map_files()` function out of
the `dump_emojis()` function. This function generates various data
files like `emoji_codes.js`, `name_to_codepoint.json`, etc., which are
used by the webapp, bugdown, and others.
This commit removes the old emoji farm generation code in favor of
`setup_old_emoji_farm()`. Instead of having individual images in the old
emoji farm, we now symlink them to the images in the new emoji farm.
This commit adds a `setup_old_emoji_farm()` function to the build_emoji
script, which changes the way the old emoji farm is set up.
Earlier we used to extract the glyphs corresponding to each emoji from
the `NotoColorEmoji.ttf` file. But since we now already have individual
images in the new emoji farm (from iamcal's 'emoji-datasource-google' npm
package), we can just symlink old emoji files to the new image files.
Apart from helping us clean up the `build_emoji` script, this also
helps reduce the increase in release tarball size caused by the
addition of the new emoji farm.
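A rough sketch of the symlinking idea; the directory names and the emoji_map structure here are assumptions, not the script's exact layout.
```python
import os
from typing import Dict

def setup_old_emoji_farm(cache_path: str, emoji_map: Dict[str, str]) -> None:
    # For each old-style emoji name, point it at the image the new emoji farm
    # (from the emoji-datasource-google package) already provides.
    old_emoji_dir = os.path.join(cache_path, 'images', 'emoji')
    new_emoji_dir = os.path.join(cache_path, 'images-google-64')
    os.makedirs(old_emoji_dir, exist_ok=True)
    for name, codepoint in emoji_map.items():
        image_path = os.path.join(new_emoji_dir, f'{codepoint}.png')
        for alias in (f'{name}.png', f'{codepoint}.png'):
            symlink_path = os.path.join(old_emoji_dir, alias)
            if not os.path.lexists(symlink_path):
                os.symlink(image_path, symlink_path)
```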
This commit implements support for copying over static files
for all bots in the zulip_bots package to
static/generated/bots/ during provisioning. This directory
isn't tracked by Git. This allows us to have access to files
stored in an arbitrary zulip_bots package directory somewhere
on the system. For now, logo.* and doc.md files are copied over.
This commit should act as a starting point for extending our
macro-based Markdown framework to our bots/API packages'
documentation and eventually rendering these static files
alongside our webhooks' documentation.
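For illustration, the copying step might look roughly like this; the internal layout of the zulip_bots package and the destination path are assumptions.
```python
import glob
import os
import shutil

import zulip_bots

def copy_bot_static_files(target_dir: str = 'static/generated/bots') -> None:
    # Copy logo.* and doc.md for each bot into a directory that isn't
    # tracked by Git, so it can be referenced later regardless of where
    # the zulip_bots package is installed on the system.
    bots_dir = os.path.join(os.path.dirname(zulip_bots.__file__), 'bots')
    for bot_name in os.listdir(bots_dir):
        bot_dir = os.path.join(bots_dir, bot_name)
        if not os.path.isdir(bot_dir):
            continue
        dest_dir = os.path.join(target_dir, bot_name)
        os.makedirs(dest_dir, exist_ok=True)
        patterns = [os.path.join(bot_dir, 'logo.*'), os.path.join(bot_dir, 'doc.md')]
        for pattern in patterns:
            for path in glob.glob(pattern):
                shutil.copy(path, dest_dir)
```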
This enforces our use of a consistent style in how we access Python
modules; "from os.path import dirname" is a particularly popular
abbreviation inconsistent with our style, and so it deserves a lint
rule.
Commit message and error text tweaked by tabbott.
Fixes #6543.
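A sketch of what such a rule could look like in our custom_check rule format; the exact keys and message wording are illustrative.
```python
python_rules = [
    {'pattern': r'^from os.path import ',
     'description': "Don't import from os.path; spell it out "
                    "(e.g. os.path.dirname) so module access stays consistent."},
]
```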
We want to convert stream names to stream ids as close
to the "edges" of our system as possible, so we let our
caller do the work of finding the stream id for a stream
narrow.
Previously these tests required you to run them with the root of the
Zulip repository as the current working directory, just due to
sloppiness.
We clean this up, while also making the path handling more consistent
and involving less fragile code.
Fixes #4169.
Previously, this was its own separate test script; now it's a normal
part of the test suite.
Tweaked by tabbott to use a proper test method.
Fixes #6327.
My first version of this just replaced the repeated list of two output
files with an array variable, but I decided `"${outputs[@]}"` was too
much to ask people to understand, and the alternative of `$outputs`,
unquoted, encourages bad habits of shell programming. So just handle
one file at a time; the only at all expensive part here is `pip-compile`.
I am tempted to move this to Python, but holding back.
This makes the code a little more concise and also more uniform,
treating `future` the same in prod and in dev. The non-uniformity
looks like an oversight in 2be8a793e, one of the commits that updated
this code for the Python 3-only world.
A couple of remarks still reflected the Python 2 world; and the
"Remove the editable flag" comment was confusingly above a line
that it looks like it could be talking about, but isn't.
There are four regexes which try to ensure that the i18n strings are
properly captured.
1) The one which disallows multiline strings.
```
i18n\.t\([^)]+[^,\{\)]$
// Disallows:
i18n.t('some '
+ 'text');
```
2) The one which disallows concatenation within argument to i18n.t():
```
i18n\.t\([\'\"].+?[\'\"]\s*\+
// Disallows:
i18n.t("some " + "text");
```
3) There are two which disallow concatenation with i18n.t():
```
i18n\.t\(.+\).*\+
// Disallows:
i18n.t('some text') +
\+.*i18n\.t\(.+\)
// Disallows:
+ i18n.t('some text')
```
The ideal case is that you try to bring the string argument to the
i18n.t() on one line. In case this is not possible, you can do the
following:
```
var1 = i18n.t("Some text to be translated");
var2 = i18n.t("Some more text to be translated");
complete = var1 + var2;
```
This is needed in order to mock the method when testing
`custom_check.py`. The diff for this commit is a bit broken;
all it really does is move the method out of `build_custom_checkers`.
Travis enables different Python versions through virtual environments,
but it seems that there is a little caveat when we try to create Zulip's
virtual environment by referring to Travis' virtual environment: Zulip's
virtual environment ends up referring to the system Python. We encountered
this behaviour when we tried to run our backend test suite under Python 3.5
in Travis. The 'python3 --version' command before activating Zulip's
virtualenv showed 'Python 3.5.3', and after it showed 'Python 3.4.3'.
This happened when we created the virtual environment using
'virtualenv -p python3'.
The solution seems to be to explicitly give the path of the Python
interpreter in the Travis' virtual environment using 'which python3'.
This adds snakeviz to the dev tools and also updates the message displayed
upon running `test-backend` with the `--profile` option to say how to run
snakeviz correctly when using the Vagrant development environment.
Since these usually result from changes to HTML templates and other
frontend-side things, it seems better to group them with the frontend.
[Tweaked by gnprice in whitespace and comments.]
We were getting pyflakes lint error output without line numbers like
this:
pyflakes | if user_profile.is_realm_admin and
pyflakes | ^
pyflakes |
Apparently the cause was that stdout and stderr were getting mixed
badly, creating "unused import" lines that had the first part of that
error (containing the line number) just above.
As a result, printing out the lines of output from pyflakes' merged
stdout/stderr feed looked like this:
b"zproject/settings.py:95: 'from .prod_settings import *' used; unable to detect undefined nameszerver/views/users.py:49:39: invalid syntax\n"
Note the lack of newline in between the end of the first error at
"names" and the start of the second at "zerver".
This appears to be a change in Pyflakes behavior when we switched to
Python 3; probably they're missing a flush() somewhere.
This replaces the old footer, which had one section with a small list of
items, expanding the footer to have multiple sections.
Actual content tweaked and tagged for i18n by tabbott.
This is hacky, but I can't figure out another way to do it that
doesn't cause other problems.
Ideally, we'd add some sort of exclude rule to our HTML template
linter so we can check the rest of the file.
`pathlib2` is a backport of pathlib to Python 2.x. Ni!
This dependency can be removed since:
- Zulip 1.6.0+git is on Python 3 nowadays.
  Ekki-ekki-ekki-ptang-zoom-boing.
- As stated in ticket #6211, removing this prevents the need to
  have lockfiles for each of 3.4, 3.5 (and maybe 3.6).
Fixes #6211.
First, all the lines of py3_dev.txt except for mypy.txt are mv-ed to
dev.txt. Then dev_lock.txt is generated from dev.txt to be used by
py3_dev.txt. `click` is removed from moto.txt since it is already a
dependency of several libraries and will appear as an autogenerated
dependency.
Also unconditionally use the `mypy` from our virtualenv --
that's how we ensure we use a common version across different
Zulip developers and in CI.
And as a side effect of cutting some Python 2 vs. Python 3 logic,
fix a bug where `--all` was having no effect.
We had been forcing provision to Python 3 in dev. Now that everything
is Python 3 and the `tools/lib/provision.py` shebang reflects that, we
can just invoke the script directly like everything else.
This causes `upgrade-zulip-from-git`, as well as a no-option run of
`tools/build-release-tarball`, to produce a Zulip install running
Python 3, rather than Python 2. In particular this means that the
virtualenv we create, in which all application code runs, is Python 3.
One shebang line, on `zulip-ec2-configure-interfaces`, explicitly
keeps Python 2, and at least one external ops script, `wal-e`, also
still runs on Python 2. See discussion on the respective previous
commits that made those explicit. There may also be some other
third-party scripts we use, outside of this source tree and running
outside our virtualenv, that still run on Python 2.
We can't fully support it until we fix the tsearch_extras availability
issue, but for now, this is an improvement.
Tweaked by tabbott to cover the outstanding tsearch_extras issue.
Checking for whether tests appear is going to be a more stable
long-term check than looking for zilencer, which we might eventually
include in releases.
Our previous dependencies on the `/root/zulip` path should all be
long gone at this point. Run our production-install test suite through
a fresh temporary path instead, mainly just to avoid causing any confusion
over whether that's quite the case.
We're about to do this (a) in a number of places mentioning
system packages like `python3-dev` to install, and (b) in the
shebangs of every script in the tree.
This consists of the `zulip_ops::stats` Puppet class, which has apparently
not been used since 2014, and a number of files that I believe were
only used for that. Also a couple of tiny loose ends in other files.
This follows up on 207cf6302 from last year to clean up cases that
have apparently popped up since then. Invoking the scripts directly
makes a cleaner command line in any case, and moreover is essential
to how we control running a Zulip install as either Python 2 or 3
(soon, how we always ensure it runs as Python 3).
One exception: we're currently forcing `provision` in dev to run
Python 3, while still running both Python 2 and Python 3 jobs in CI.
We use a non-shebang invocation to do the forcing of Python 3.
This adds the authors of the Zulip repository on GitHub to
/authors/, along with re-styling the page to fit the same
aesthetic as /for/open-source/ and other product pages.
- Category 'All' -> text 'Filter by category'; chevron-right icon when
  the dropdown is closed, chevron-down icon when the dropdown is open.
- All other categories -> text CATEGORIES[state.category]; chevron-down icon.
I think soon we'll put the Python 3 venv at `/srv/zulip-venv` and
make `/srv/zulip-py3-venv` just an alias to that (then remove the
alias too), but for now the old name is helpful for spotting places
that need an update.
The `/srv/zulip-venv` name still appears in codepaths used by Travis
tests.
From the line in tools/provision, it should trickle down to the rest of the
scripts. This works since almost all the Python scripts have been linted
to be generic.
Proof that the setup is Python 3-only: with this commit, within the
Vagrant container env, /srv/zulip-venv is no longer present and
`./tools/run-dev.py` runs just fine.
[gnprice: Added `rm -f` and warning message, and made small edits.]
This was causing `language_options.json` and the `django.mo` files
never to get updated by `upgrade-zulip-from-git` after first deploy.
The stale `django.mo` files meant that any message string which wasn't
in the original from-tarball installation of the server would never
get translated.
The same bug exists in several other spots in this file. Some of
our repetitive-yet-subtly-different uses of `cp` should probably
turn into just a list of arguments to one `rsync -R` command.
Also this should probably really use `rsync --delete` -- for example,
in prod on zulipchat.com we still have lying around a `locale/zh_CN`
directory, even though in a freshly-built static tree there's only
`locale/zh_Hans` instead.
This is an old script that could be used to literally print the Zulip
codebase to a PDF. It hasn't been run since ~2013, and probably never
will be used again. Since it's probably not interesting enough to be
its own open source project, we should just delete it.
Since new mypy versions frequently break old versions, this check is a
useful way to help prevent problems where mypy output is wrong.
We could probably tighten this by checking explicitly the expected
mypy version from requirements.txt, but that's work.
Replaces #5026.
It's hard to find literature with the community tone we're going for, that
is consistent with the Zulip code of conduct, etc.
This commit removes the special tooling for Gutenberg plays, and changes the
text to be some mixture of scigen, Communications From Elsewhere,
chat.zulip.org, and various books from the public domain.
Create a generator script to pull lines from a play, enhancing
random lines with emoji, Markdown and other flair.
With numerous contributions from Rein Zustand and Tim Abbott to finish
the project.
Fixes: #1666.
This file hasn't reflected the actual configuration of any live
installation for some time, nor been part of any tests or other
mechanism to regularly validate it, so it's naturally fallen
behind as we make changes to the set of settings and typically
don't update this file accordingly. Just remove it; all the
documentation functions it serves are already served just as
well by prod_settings_template.py and its ample comments.
Fixes #5887. It seems there's some sort of issue in CPython where it
can get confused into thinking a `.pyc` file that's actually stale is
up to date -- perhaps when they date from the same second, say from
the middle of a rebase.
For now, rather than dig further to fix this properly and be sure of
having really done so, just go back to wiping out all `.pyc` files.
The impact is about 1s; that's noticeable when running a short test
file on a quad-core machine (about 8s), but not so much on a more
typical dev machine or on a larger set of tests.
This new module tracks the recent topic names for any given
stream.
The code was pulled over almost verbatim from stream_data.js,
with minor renames to the function names.
We introduced a minor one-line function called stream_has_topics.
Notably, this adds our checks on translated message strings to
`tools/test-all`, so that they don't cause surprise failures in
CI after a branch is pushed. (Alternately they could have gone
in `tools/lint` to accomplish the same goal, but `makemessages`
which they depend on is quite slow -- on my machine it takes 7s,
compared to 10s for all of `tools/lint`.)
static/ serves static files which get copied around per deploy. Since
the webpack stats files need a consistent name and change per deploy,
they can't live in static/.
This fixes a bug that prevented downgrading a Zulip server to an old
version.
This allows the webpack dev server to properly reload JavaScript modules
while running in dev without restarting the server. We need to connect
to webpack-dev-server directly because SockJS doesn't support more than
one connection on the same host/port.
This hack saved a lot of time debugging weird issues back in 2016, but
it's no longer needed.
Anyone rebasing a branch from 2016 will need to provision afterwards
regardless, which will fix this issue automatically, and more
importantly, these changes were made obsolete when we moved to the
cached `node_modules` model.
Updates the `get_success_stamp()` function to use the `emoji-datasource`
package's version while calculating the success stamp, so that an emoji
cache rebuild gets triggered automatically if the version changes.
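A simplified sketch of the idea; the cache path and stamp naming here are assumptions.
```python
import json
import os

EMOJI_CACHE_PATH = '/srv/zulip-emoji-cache'  # assumed location

def get_success_stamp() -> str:
    # Fold the emoji-datasource version into the stamp path, so bumping the
    # npm package automatically invalidates the cached emoji build.
    with open('node_modules/emoji-datasource/package.json') as fp:
        version = json.load(fp)['version']
    return os.path.join(EMOJI_CACHE_PATH, version, 'emoji', '.success-stamp')
```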
Apparently, this was missing on Xenial, and we just never noticed
because the package was installed by default on Ubuntu.
It doesn't exist yet in Trusty.
This tool was used for downloading sprite sheets from iamcal's
repository. Since we have now moved to using the `emoji-datasource`
npm package, this tool is no longer required.
This commit does the following things:
* Instead of using a manual tool for downloading sprite sheets, use
`emoji-datasource` npm package.
* Modify the `build_emoji` script to use sprite sheets from the npm
package.
Bumps PROVISION_VERSION.
Fixes: #4730.
NPM packages should be installed at the beginning of the provisioning
process, so that if a script later in the process requires an NPM
package, it can use it. Earlier, we were installing NPM packages
last, because the installation process can fail due to network issues;
but since we now retry when the installation fails, they can safely be
installed at the beginning of the process, just like apt packages.
This mostly sets the stage for a subsequent commit to start
using client_message_id as the key into sent_messages.
It has the nice side effect of making it more explicit that
certain things should always happen when transmit_message()
succeeds.
This commit does regress our node test coverage a bit.
This is mostly straightforward moving of code out of compose.js.
The code that was moved currently supports sending time
reports for sent messages, but we intend to grow out the new
module to track more state about sent messages.
The following function names in this commit are new, but their
code was basically pulled over verbatim:
process_success (was process_send_time)
set_timer_for_restarting_event_loop
clear
initialize
All the code in the new module is covered by previous tests that
had been written for compose.js. This commit only modifies
a few things to keep those tests passing.
The new module has 100% node coverage, so we updated `enforce_fully_covered`.
This redesigns the /help/ pages to be a single-page app that uses
history.pushState to work the same as the old app.
The big new feature is that now we have the index in a nicely designed
left sidebar.
This works around a frequent problem we've been having with Travis
CI's apt repository being busted on startup. This, in turn, caused
`apt-get install moreutils` to fail.
We attempt to fix this by adding our own retry logic.
This optimizes .pyc file removal.
Instead of removing all .pyc files, it will only remove .pyc files where
the corresponding .py file is not found. This saves about 1s in the
startup time of `run-dev.py` (and thus also `test-js-with-casper`).
Fixes #5760.
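Roughly, the optimization looks like this sketch (ignoring `__pycache__`-style layouts for simplicity):
```python
import glob
import os

def remove_stale_pyc_files(root: str = '.') -> None:
    # Only delete a .pyc whose source .py no longer exists, instead of
    # wiping every compiled file on each startup.
    for pyc_path in glob.glob(os.path.join(root, '**', '*.pyc'), recursive=True):
        py_path = pyc_path[:-1]  # foo.pyc -> foo.py
        if not os.path.exists(py_path):
            os.remove(pyc_path)
```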
This system hasn't been in active use for several years, and had some
problems with its design. So it makes sense to just remove it to declutter
the codebase.
Fixes #5655.
Now that we have decided to move our API bindings to a separate
repo and specify them as a pip dependency, we can stop modifying
PATH to include the api/ directory.
The commit is composed of:
1) Distill out top-level dependencies in common.txt.
2) Add the -e flag to the vcs-based packages, because pip-compile can't
do without it.
3) Run pip-compile to generate the locked files, then remove the -e flags
from the lockfile.
4) Pin pathlib2 in dev.txt, because it turns out it is a direct
requirement of documentation_crawler.
5) Document the structure and add an automation script.
6) Remove cryptography==1.9 from requirements/scrapy.txt, since
cryptography is automatically added from pyopenssl.
7) Add a sed command to remove future/futures from the generated lock
file in python3 (this should have been automatically handled by
pip-compile, so pending the feature from pip-compile).
Tweaked by tabbott to update PROVISION_VERSION and add a missing
`first` dependency.
In preparation for the upcoming linter output update, each custom
linter now has an identifier argument. This will be used for separating
each line of lint output, e.g.:
py | Missing space around "%" at analytics/views.py line 485:
py | output += '<hr>%s\n'%(string_id,)
css | 64c64,65
css | < opacity: 1; }
css | ---
css | > opacity: 1;
css | > }
css | static/styles/lightbox.css seems to be broken:
swagger | In static/swagger/zulip.yaml:
swagger | Duplicate operationId: registerQueue
For performance reasons, we spawn each linter in a separate OS thread.
The downside of this is that all lint output would end up on stdout without
much visual separation, resulting in a confusing error log. This commit
introduces the `print_err` function, which shows which linter each line
of output is from.
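A minimal sketch of `print_err`; the real function may also handle colors and output encoding.
```python
def print_err(name: str, color: str, line: str) -> None:
    # Prefix each line of a linter's output with the linter's identifier,
    # padded so the '|' separators line up across linters.
    print(f'{name:>8} | {line.rstrip()}', flush=True)
```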
The file ioloop_logging.py is pretty small, but it seems to have
flaky time-sensitive code that creates noise for coverage deltas
when we do builds. Since it's small, we just exclude it.
Tweaked by tabbott to add comments.
In this commit we start overriding the request method of
httplib2.Http() to raise an exception whenever it is called, i.e.,
whenever an attempt is made to access the network from the test suites.
Fixes: #1472.
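A sketch of the monkey-patching approach; the exception class here is illustrative, not necessarily what the real code uses.
```python
import httplib2

class NetworkAccessDenied(Exception):
    pass

def deny_network_access(*args: object, **kwargs: object) -> None:
    raise NetworkAccessDenied('Tests are not allowed to access the network.')

# Installed once at test-suite startup; any code path that tries to hit the
# network through httplib2 now fails loudly instead of silently succeeding.
httplib2.Http.request = deny_network_access  # type: ignore[assignment]
```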
This brings the total no-op provision time down to around 7.3s!
Tweaked by tabbott to clean up the code, close file descriptors, and
avoid issues with a somehow corrupted file.
Fixes #5185.
This commit adds a new linter which runs from tools/travis/backend.
It runs over the translations.json file and checks whether any of the
translatable strings contain handlebars in them.
Fixes #5544.
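A rough sketch of such a check; the path to translations.json is an assumption.
```python
import json
import re
import sys

def check_handlebars_in_translations(path: str) -> int:
    # Translatable strings should not embed handlebars templates; flag any
    # string containing a '{{ ... }}' construct.
    with open(path) as fp:
        strings = json.load(fp)
    errored = 0
    for text in strings:
        if re.search(r'{{.+?}}', text):
            print(f'Handlebars found in translatable string: {text!r}')
            errored = 1
    return errored

if __name__ == '__main__':
    sys.exit(check_handlebars_in_translations('static/locale/en/translations.json'))
```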
The Swagger specification indicates that all operationId values should be
unique. However, SwaggerParser doesn't complain during validation if that
doesn't happen, so this commit adds our own method to identify these
cases.
Also, the violations of this rule have been fixed.
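The extra check might look something like this sketch run against the parsed spec dict:
```python
from typing import Any, Dict

def check_duplicate_operation_ids(spec: Dict[str, Any]) -> None:
    # SwaggerParser accepts specs with repeated operationId values, so walk
    # paths/methods ourselves and fail on any duplicate.
    seen: Dict[str, str] = {}
    for path, methods in spec.get('paths', {}).items():
        for method, definition in methods.items():
            if not isinstance(definition, dict):
                continue  # skip path-level entries such as 'parameters'
            operation_id = definition.get('operationId')
            if operation_id is None:
                continue
            location = f'{method.upper()} {path}'
            if operation_id in seen:
                raise ValueError(f'Duplicate operationId: {operation_id} '
                                 f'({seen[operation_id]} and {location})')
            seen[operation_id] = location
```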
Handlebars allows putting tildes (~) next to the braces in mustache tags to
strip whitespace on that side of the tag.
{{#if foo~}}
<p>Bar</p>
{{~/if}}
This way, the linter ignores any potential tilde at the ends of Handlebars
tags so we can use this feature without lint fails.
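One simple way to tolerate the tildes, as a sketch, is to normalize them away before the tag is inspected:
```python
import re

def strip_handlebar_tildes(tag: str) -> str:
    # Treat '{{~#if foo~}}' the same as '{{#if foo}}' for linting purposes;
    # the tildes only control whitespace in the rendered output.
    return re.sub(r'~}}', '}}', re.sub(r'{{~', '{{', tag))
```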
In this commit, we add a new tool which can be used to clear out
the emoji cache. Cache cleanup is a good thing to do in general,
and it also helps keep the Travis cache small (meaning faster builds).
Credits also to @HarshitOnGitHub for suggesting the use of
emoji symlinks.
In order for the `include_only` linter rule to not have
any side effects, we need to explicitly add a trailing '/'
after every directory we want to include.
The main thing here is to make looping over lines be the inner
loop, instead of looping over rules. This keeps regexes in
cache, and it also avoids some O(N) checks.
This is a significant speedup for me, reducing time from 16s
to 11s.
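A simplified sketch of the restructured loop, with lines on the inside:
```python
import re
from typing import Any, Dict, List

def check_lines(lines: List[str], rules: List[Dict[str, Any]]) -> bool:
    # Compile each rule once and scan all lines with it, instead of
    # re-selecting rules for every line; the hot regex stays cached and
    # per-rule setup is no longer repeated per line.
    failed = False
    for rule in rules:
        pattern = re.compile(rule['pattern'])
        for line_number, line in enumerate(lines, start=1):
            if pattern.search(line):
                print(f"{rule['description']} at line {line_number}")
                failed = True
    return failed
```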
The problem this commit solves is related to how we search for
testcases in the test files. We use a simple 'search_key' in file_data.
This will return a false positive if there is a 'longer_search_key' in
file_data.
While searching for TestCases we should use a longer key to remove the
possibility of collision. Using 'class TestCase(' should be precise
enough.
Fixes #4983.
In this commit we add a wrapper around the provisioning inside
setup-backend to trigger another try at provisioning if it fails,
in hopes that it will run fine this time if it was interrupted by a
network issue the first time.
Fixes: #2034.
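The retry logic amounts to something like the sketch below, shown in Python for illustration; the real wrapper lives in the setup-backend shell script.
```python
import subprocess
import time

def provision_with_retry(attempts: int = 2) -> None:
    # Network hiccups are transient, so give provisioning another try
    # before declaring the build failed.
    for attempt in range(1, attempts + 1):
        try:
            subprocess.check_call(['tools/provision'])
            return
        except subprocess.CalledProcessError:
            if attempt == attempts:
                raise
            time.sleep(10)
```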
This commit also makes the following changes:
* Bumps the version of Django to 1.11.2.
* Fixes the HTTP response headers. Now CommonMiddleware sets Content-Length
header for non-streaming response, see
https://docs.djangoproject.com/en/1.11/ref/middleware/#module-django.middleware.common.
Due to this, 'Transfer-Encoding: chunked' header is removed, which signifies
a streaming response.
In this commit we stop ignoring 'base.html' from the linting
point of view. The condition was implemented in a bad way,
resulting in every template ending with `base.html` being ignored.
Also, since removing it no longer makes the linter complain that
base.html is non-parsable (we have made progress with our linter),
we can safely remove the filter condition.