This is more encapsulated and more efficient.
In the cases where `is_force` is `True` or
`pygments_data.json` is missing, we now avoid
the unnecessary step of importing `pygments`, at
least up front.
(Of course, we probably import that once we generate
the artifacts.)
If somebody is having issues with provision, it's
plausible they'll do something like `git clean -fX`
to clean up old artifacts of earlier provision runs,
as part of debugging things.
We defend against this by detecting the most obvious
symptom as cheaply as possible.
I remove `is_force` from `file_or_package_hash_updated`
and modernize its mypy annotations.
If `is_force` is `True`, we now just run the thing
we want to force-run without having to call
`file_or_package_hash_updated` to expensively
and riskily return `True`.
Another nice outcome of this change is that if
`file_or_package_hash_updated` returns `True`,
you can know that the file or package has
indeed been updated.
For the case of `build_pygments_data` we also
skip an `os.path.exists` check when `is_force`
is `True`.
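The resulting guard, roughly sketched (the hash name and file path
below are illustrative, not the exact provision code):

    import os

    def need_to_build_pygments_data(is_force: bool) -> bool:
        if is_force:
            return True  # short-circuit both the hash check and the exists check
        if not os.path.exists("static/generated/pygments_data.json"):
            return True  # e.g. somebody ran `git clean -fX`
        return file_or_package_hash_updated(
            ["tools/setup/build_pygments_data"], "build_pygments_data_hash"
        )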
We will short-circuit more logic in the next
few commits, as well as cleaning up some of
the long/wrapped lines in the `if` statements.
We change the message for skipping RabbitMQ
configuration to match nearby messages:
No need to run `tools/setup/build_pygments_data`.
No need to run `scripts/setup/inline_email_css.py`.
No need to run `scripts/setup/configure-rabbitmq`.
No need to regenerate the dev DB.
No need to regenerate the test DB.
No need to run `manage.py compilemessages`.
For upgrade-zulip-from-git to work, we need to be able to run
update-prod-static on production systems, which means provision code
like this cairosvg logic needs to be there for now.
When creating a new webhook integration or updating an existing one, it
is a pain to create or update the screenshots in the documentation. This
commit adds a tool that can trigger a sample notification for the
webhook using a fixture that is likely already written for the tests.
Currently, the developer needs to take a screenshot manually, but this could
be automated using puppeteer or something like that.
Also, the tool does not support webhooks with basic auth, and only supports
webhooks that use json fixtures. These can be fixed in subsequent commits.
We figure out the dev host using the same logic as
dev_settings.py, so that we don't use wrong things
like 127.0.0.1 for droplet users.
And we display the link in cyan.
Generated by `pyupgrade --py3-plus --keep-percent-format` on all our
Python code except `zthumbor` and `zulip-ec2-configure-interfaces`,
followed by manual indentation fixes.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
If a request fails the tool sleeps for some time before making
further requests. The sleep time is a random number between
0 and 2^failures capped at 64 seconds. More details about the
algorithm can be found at https://chat.zulip.org/#narrow/stream/
92-learning/topic/exponential.20backoff.20--.20with.20jitter
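A minimal sketch of that backoff-with-jitter pattern (names here are
illustrative, not the tool's actual code):

    import random
    import time

    MAX_SLEEP = 64  # seconds

    def sleep_after_failure(failures: int) -> None:
        # Sleep a random duration between 0 and 2**failures, capped at 64s.
        time.sleep(random.uniform(0, min(2 ** failures, MAX_SLEEP)))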
This is a prep commit for generating /team page data
using a cron job. The zerver/tests directory is not present in
production installations, so we move the file from that
directory to tools/.
As described in the commit that added this function, this fixes one
quite annoying bug and one at least in-principle bug:
* On Windows, the simple version (lacking `git update-index
--refresh`) routinely gives false positives, making the tools
that rely on it basically unusable.
* If you have uncommitted changes in the index but manage to have
the worktree nevertheless match HEAD, the simple version will
give a false negative and we'd blow away those changes.
This is verbatim from Git upstream, at an older version. (The one
change since then is to add localization for the messages like "You
have unstaged changes" -- which complicates the code, is important and
worth it for Git itself, but for our tools we can do without.)
This function will replace our use of `git diff-index --quiet HEAD`
in several scripts. The key differences in behavior are:
* The `git update-index --refresh`. Without this, on Windows
apparently `git diff-index` routinely (but not all the time!)
reports that tons of files have changed. See report:
https://chat.zulip.org/#narrow/stream/9-issues/topic/.2E.2Ftools.2Ffetch-pull-request.20issue/near/834435
* Instead of one command comparing the worktree to HEAD, we
separately compare the worktree to the index and the index to
HEAD, and abort if either diff is nonempty. This one is obvious,
but rather an edge case (it matters only if you've managed to
make the worktree and HEAD agree while the index has some
changes), and the extra code is annoying if written out in every
script that needs it. But that's what a subroutine is for. :-)
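For illustration, the same two-step check sketched in Python (the real
helper is a shell function taken verbatim from Git upstream):

    import subprocess

    def work_tree_is_clean() -> bool:
        # Refresh the index first; without this, `git diff-index` gives
        # routine false positives on Windows.
        subprocess.call(["git", "update-index", "-q", "--refresh"])
        # Worktree vs. index, then index vs. HEAD; both must be empty.
        unstaged = subprocess.call(["git", "diff-files", "--quiet"])
        staged = subprocess.call(
            ["git", "diff-index", "--quiet", "--cached", "HEAD"]
        )
        return unstaged == 0 and staged == 0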
We'll make a few tweaks before actually switching to use this.
The Git commands we're invoking to do the real work are useful to
print, for transparency to see what's happening and that there's no
magic here.
The boring shell stuff like `remote=${2:-"upstream"}` is not so
helpful, and nor is the rather arcane and in any case read-only
command `git diff-index --quiet HEAD`. Those only add noise that
obscures the interesting parts. So, move the `set -x` down to when
we're done with the boring preparatory stuff and ready to perform
the commands that do the work.
Add sgrep (sgrep.dev) to tooling and include simple rule as
proof of concept. Included rule detects use of old django render
function.
Also added a rule that looks for if-else statements where both
code paths are identical.
While we could fix this issue by changing the markdown processor,
doing so is not a robust solution, because even a momentary bug in the
markdown processor could allow cached messages that do not follow our
security policy.
This change ensures that even if our markdown processor has bugs that
result in rendered content that does not properly follow our policy of
using rel="noopener noreferrer" on links, we'll still do something
reasonable.
Co-authored-by: Tim Abbott <tabbott@zulipchat.com>
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Folks can have issues connecting to Casper
as zulipdev.com when they are not connected to
the internet or just have a bad connection, since
the DNS record is on the internet. Folks can
work around this by just creating an /etc/hosts
entry for zulipdev.com, but people don't always
know.
This fix moves the symptom slightly earlier in
the process--we don't advertise that the server
is "up" if you can't actually connect to it as
"zulipdev.com".
In the past it has blocked Python library security updates with overly
strict version bounds, and we don’t use it as a library, only as a
binary.
Skip the PROVISION_VERSION bump because we can use the tx binary from
either location.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This reverts commit 36a8e61e67 (#13934).
The Django 2.2 autoreloader works by forking into a child process that
exits with status 3 when a file changes, and a parent process that
restarts the child when it exits with status 3. Setting this
environment variable had the effect of pretending we were already the
child process, without a parent process to restart it. Therefore,
changing any code used by the queue processor caused it to exit rather
than restart.
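Roughly, the reloader protocol looks like this (an illustrative sketch,
not Django's actual code):

    import os
    import subprocess
    import sys

    from django.utils.autoreload import DJANGO_AUTORELOAD_ENV

    def restart_with_reloader() -> int:
        # Parent loop: re-run ourselves as a child with the autoreload
        # environment variable set.  The child watches files and exits
        # with status 3 when one changes, telling us to start a new child.
        while True:
            env = {**os.environ, DJANGO_AUTORELOAD_ENV: "true"}
            exit_code = subprocess.call([sys.executable] + sys.argv, env=env)
            if exit_code != 3:
                return exit_code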
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
python-dev will be deprecated in Focal but can be used as python2-dev,
so we removed it from the common dockerfile.template and added it
as an extra package in .circleci/config.yml.
Previously, we only did apt updates when our sources.list files or
keys changed, which could result in provisioning errors for
development systems that don't routinely update their apt cache
(probably including ~all Vagrant environments).
cgi.escape was deprecated in Python 3.2 and removed in Python 3.8.
The function was unsafe because quote is False by default, so we have
removed its use and replaced it with the safer html.escape.
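For example:

    import html

    html.escape('<a href="x">')               # '&lt;a href=&quot;x&quot;&gt;'
    html.escape('<a href="x">', quote=False)  # '&lt;a href="x"&gt;' (old cgi.escape default)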
Use the get_venv_dependencies function to return the correct dependencies
for RHEL, CentOS, and Fedora rather than importing them as a separate
COMMON_YUM_DEPENDENCIES in provision and create-production-venv.
We now restrict emails on the zulip realm, and now
`email` and `delivery_email` will be different for
users.
This change should make it more likely to catch
errors where we leak delivery emails or use the
wrong field for lookups.
Tests for these links often result in rate-limiting from GitHub,
leading to the builds failing in Circle CI. We temporarily mark
github.com/zulip links as external to keep the builds passing.
Added a get_venv_dependencies() function in setup_venv.py which
returns VENV_DEPENDENCIES according to the vendor and os_version.
The reason for adding this function is that python-dev will be
deprecated in Focal but can be used as python2-dev, so when adding
support for Focal, VENV_DEPENDENCIES needs to be os_version dependent.
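A minimal sketch of the idea (the base list and version check here are
illustrative):

    from typing import List

    VENV_DEPENDENCIES = ["build-essential", "libffi-dev"]  # illustrative base list

    def get_venv_dependencies(vendor: str, os_version: str) -> List[str]:
        # Focal drops python-dev, so pick the matching dev package per release.
        if vendor == "ubuntu" and os_version == "20.04":
            return VENV_DEPENDENCIES + ["python2-dev"]
        return VENV_DEPENDENCIES + ["python-dev"]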
There were two problems with the previous code:
1) The code glob.glob("scripts/lib/build-") should have been
glob.glob("scripts/lib/build-*"), otherwise it would always return [].
2) The part of the code where we included scripts/lib/build-* in the
sha1 sum check would only run when debian is not in os_families(). This
wasn't correct, as we could have a situation where we have to build
pgroonga from source even in the case of debian, so we needed to
improve that condition.
Now, since we only have build-pgroonga there, it's better to just
directly hash its content under the BUILD_PGROONGA_FROM_SOURCE
condition.
This should fix spurious failures, where test-run-dev would occasionally
freeze. Exactly what about these changes was causing that remains to be
investigated; this is merely meant as a fix for the failures.
This reverts commit 19429c3ad7.
We plan to use these records to check and record the schema of Zulip's
events for the purposes of API documentation.
Based on an original messier commit by tabbott.
In theory, a nicer version of this would be able to work directly off
the mypy type system, but this will be good enough for our use case.
Before this test, we were validating the behavior
of `i18next`, but we weren't validating our light
layer that sits on top of `i18next`, which currently
resides in the slightly misnamed `translations.js`
file.
The translations module is now so small that I'll
just quote it verbatim here:
    import i18next from 'i18next';

    i18next.init({
        lng: 'lang',
        resources: {
            lang: {
                translation: page_params.translation_data,
            },
        },
        nsSeparator: false,
        keySeparator: false,
        interpolation: {
            prefix: "__",
            suffix: "__",
        },
        returnEmptyString: false, // Empty string is not a valid translation.
    });

    window.i18n = i18next;
We now just do `zrequire('translations')` to initialize
the `i18next` library, which allows us to have simpler
test setup and to actually exercise the above call to
`i18next.init`.
This change now gives us 100% line coverage of `translations.js`,
which of course isn't that hard to achieve (see above).
isort 5 knows not to reorder imports across function calls, so this
will stop isort from breaking our code.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This gives them cache-compatible URLs, and also avoids some extra
copies of the sprite sheet images.
Comments on the Octopus emoji added by tabbott.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Now the caller simply imports the debug ‘require’ function as a
module, deciding for itself how to expose it and with what name (in
our case, we expose it as ‘require’ with expose-loader). Also, remove
a stray console.log.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This adds a global require() function that makes JS modules accessible
to the browser console without adding them to the global window
object:
» const typeahead = require("./static/shared/js/typeahead");
» typeahead.popular_emojis
Array(6) [ "1f44d", "1f389", "1f642", "2764", "1f6e0", "1f419" ]
The list of known modules is exposed via the keys of require.ids
object.
This will allow us to migrate more modules to ES6 without losing
access to this debugging functionality.
I’ll probably upload this plugin to NPM at some point, but I figured
I’ll let it bake in-tree first.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This moves some code from settings_display.js
into the new module settings_config.js.
Extracting this module breaks some dependencies
on settings_display.js (which has some annoying
transitive dependencies, including jQuery).
In particular this isolates stream_data from
settings_display.js.
Two of the three structures that we moved here
weren't even directly used by settings_display.js,
since we do a lot of rendering in the modules
admin.js and settings.js.
We make get_all_display_settings() a function
to avoid a require-time dependency on page_params.
Breaking the dependencies simplifies a few
node tests.
Most of the node test complexity came from the
following commit in March 2019:
5a130097bf
The commit itself seems harmless enough, but
dependencies can have a somewhat "viral" nature,
where making stream_data depend on settings_display
caused us to modify four different node tests.
This cleans up a few things:
- just yield values so we don't have to do
tedious max logic
- use values() instead of items() for
skin_variations loop
In the ideal world the emoji.json would reduce this
code to `get_square_size = lambda data: data['square_size']`,
but I don't think we can get the square size explicitly.
This commit changes the calculation of the
background-size parameter that we use to
render emojis from sprite sheets.
In particular, it now makes the parameter
match the sizes of our latest sprite
sheets from Twitter/Google.
This should fix the geometry aspect of #13959,
but we also need to fix some issues with the
cache being sticky.
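For intuition, the geometry is simple (an illustrative helper, not the
generator's actual code): each cell is 1/ncols of the sheet's width and
1/nrows of its height, so the sheet must be scaled to ncols x nrows
times the size of the emoji span.

    def background_size(ncols: int, nrows: int) -> str:
        # e.g. a hypothetical 10x12 sheet gives "1000% 1200%".
        return "{}% {}%".format(ncols * 100, nrows * 100)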
There is also some minor cleanup:
- Remove obsolete -moz/-webkit CSS.
- Remove needless precision in percentages.
- Fix the transposed nrows/ncols names.
- Add extensive commenting.
Finally, we add a minor bump to the provision
number. This commit should be merged in the
same series as the other fix for this issue,
which will probably have a major bump, and we'll
need to rebase this appropriately.
While it's a bit of extra complexity to do this check, which I'm not
excited about, we've had multiple folks spend significant time being
confused rebasing past d7d8632525 into
deleting `pygments_data.json`, with provision not rebuilding it, so
this seems worth merging as a transitional fix even if we decide to
remove it in 2 months.
Without calling cov.erase() the data file seems to persist and even
pollute future test runs if not removed. Registering an atexit handler
seems like a good, and reasonably clean way to ensure the cleanup
happens.
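The pattern is roughly:

    import atexit

    from coverage import Coverage

    cov = Coverage()
    atexit.register(cov.erase)  # drop the data file on exit, so it can't pollute later runs
    cov.start()
    # ... run the test suite ...
    cov.stop()
    cov.report()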
Fixes #13933.
This allows us to collect coverage for Handlebars templates, and also
improves the readability of Handlebars-related stack traces.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
In Django 2.2 the autoreload system has changed.
DJANGO_AUTORELOAD_ENV env variable should be set when calling code
that'll use the autoreloader. Otherwise there's some kind of race
condition in the autoreload code when SIGINT is sent, where
restart_with_reloader() (called only if the env variable isn't set)
has the subprocess module calling p.kill() on a process that's already
exited, raising ProcessLookupError and printing an ugly traceback. This
causes non-deterministic test-run-dev failures.
We used to have a block of code doing this just in the presence
endpoint because that's where we'd had error-handling problems with it
not being present, but it seems more correct for it to run
unconditionally on all HTTP requests.
This requires adding a dependency of channel on reload_state, which we
record in the webpack configuration for now.
This should return us to a situation where we won't get blueslip
browser error reporting for users created while a device was offline
just before it reloads.
1) Created a new class `DatabaseType` and accessed its objects inside
`template_database_status()` instead of sending five arguments with
default values.
2) Made `check_files` and `setting_name` local variables instead of
function parameters, since they had the same value (None) for every call.
Fixes #13845.
webpack optimizes JSON modules using JSON.parse("{…}"), which is
faster than the normal JavaScript parser.
Update the backend to use emoji_codes.json too instead of the three
separate JSON files.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
I believe we can remove these and rely on
other parts of our testing/code-review
to ensure template quality.
These tests never really exercised our
app code, as evidenced by us not regressing
any of the 100%-line-coverage files.
We have a couple other ways that we verify
the correct format of the templates:
- webpack (can they compile?)
- check-templates (are they nicely indented?)
For deep testing, we have Casper, which
exercises most of our most important templates
in some meaningful way.
I think it's pretty rare that we get bugs
now that are directly caused by bad templates,
and an even smaller subset of them would
have been caught by the node tests.
If that trend changes in the future, I would prefer to
just do something "greenfield" to address
any common problems rather than resurrect
this code, but we could always resurrect it
from git.
The template node tests did check a little bit of
detail about which fields are there, but not
in an integrated way, so that aspect of the tests
wasn't very useful either.
This adds Ubuntu 19.10 as a valid provisioning target.
The release test in setup-apt-repo was changed from a list of values to
a regex check for brevity.
Every CLI program should have a usage message.
Also add a mention in the `push-to-pull-request` usage message of
its participation in the `refs/remotes/pr/` pseudo-remote feature.
This gives us the right behavior when using the `url.*.insteadOf`
mechanism for aliases in Git remote URLs. For example, if
one's ~/.gitconfig has:
    [url "git@github.com:"]
        insteadOf = gh:
then `git remote add upstream gh:zulip/zulip` will work great, as
the nice, short, mnemonic `gh:` prefix gets expanded to the more
finicky `git@github.com:`. I use just such a prefix routinely.
But the feature does require that scripts go through the right
abstractions. In particular `git remote get-url`, since Git 2.7
(from 2016), exists for exactly this reason. A plain `git config`
command bypasses the expansion, getting the verbatim `gh:...`
version, which doesn't work.
So, switch to that.
As a bonus, we get to behave correctly if for some reason the user
has configured a push URL distinct from the fetch URL for this
remote, just by adding `--push`. With `git config`, we'd have had
to manually implement the fallback from `remote.upstream.pushUrl` to
`remote.upstream.url` in order to properly handle that case.
We prefer this to internal_send_message().
We are trying to deprecate `internal_send_message`,
which has extra moving parts related to
`extract_recipients` and `Addressee.legacy_build`.
There are two chunks of code that I touch here
that look pretty similar, but I'm not quite
sure they're worth de-duplicating, since they
use different topics and different message
content.
Instead of having `notify_new_user` delegate
all the heavy lifting to `send_signup_message`,
we just rename `send_signup_message` to be
`notify_new_user` and remove the one-line
wrapper.
We remove a lot of obsolete complexity:
- `internal` was no longer ever set to True
by real code, so we kill it off, as well
as killing off the internal_blurb code
and the now-obsolete test
- the `sender` parameter was actually an
email, not a UserProfile, but I think
that got past mypy due to the caller
passing in something from settings.py
- we were only passing in NOTIFICATION_BOT
for the sender, so we just hard code
that now
- we eliminate the verbose
`admin_realm_signup_notifications_stream`
parameter and just hard code it to
"signups"
- we weren't using the optional realm
parameter
There's also a long ugly comment in
`get_recipient_info` related to this code
that I amended for now.
We should try to take action in a subsequent
commit.
We now have 100% line coverage on 71 JS files.
This is thanks to about 150 people who have
contributed code to frontend/node_tests.
And then 126 files are still short of 100% line
coverage.
We now enforce line coverage with a set called
EXEMPT_FILES, which are the files for which
we do NOT expect to have 100% coverage.
Using an exemption list makes it so that adding
a new JS file to the project without 100% line
coverage will cause the build to fail. This will
encourage folks to be intentional about their
lack of test coverage.
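The enforcement itself is conceptually simple (a sketch with
hypothetical names and data shape):

    import sys
    from typing import Dict, Set

    def enforce_full_coverage(line_pct: Dict[str, float],
                              exempt_files: Set[str]) -> None:
        # line_pct maps each JS file to its line-coverage percentage.
        failed = False
        for filename, pct in sorted(line_pct.items()):
            if filename in exempt_files:
                continue  # in EXEMPT_FILES; we don't expect 100% here
            if pct < 100:
                print("ERROR: %s no longer has 100%% line coverage" % (filename,))
                failed = True
        if failed:
            sys.exit(1)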
If a file that had 100% coverage somehow regressed
to 0% coverage, we would report an error to the
console, but we weren't treating it as an actual
failure.
We've probably always had this bug, but it probably
rarely was an issue, since devs might have seen
the error locally, or hopefully whatever crazy
thing you did to totally remove coverage would
have had other symptoms.
If this was intentional, I suspect it might have
had something to do with wanting to get coverage
reports when you just run individual tests. But
a while back we changed it so that when you run
individual tests, we don't do the line coverage
enforcement.
My upstream remote is named origin, so commit-message-lint was always
complaining at me. Detect the right remote name from the output of
`git remote -v`.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
We now use vdom-ish techniques to track the
list items for the pm list. When we go to update
the list, we only re-render nodes whose data
has changed, with two exceptions:
- Obviously, the first time we do a full render.
- If the keys for the items have changed (i.e.
a new node has come in or the order has changed),
we just re-render the whole list.
If the keys are the same since the last re-render, we
only re-render individual items if their data has
changed.
Most of the new code is in these two modules:
- pm_list_dom.js
- vdom.js
We remove all of the code in pm_list.js that is
related to updating DOM with unread counts.
For presence updates, we are now *never*
re-rendering the whole list, since presence
updates only change individual line items and
don't affect the keys. Instead, we just update
any changed elements in place.
The main thing that makes this all work is the
`update` method in `vdom`, which is totally generic
and essentially does a few simple jobs:
- detect if keys are different
- just render the whole ul as needed
- for items that change, do the appropriate
jQuery to update the item in place
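Sketched in Python for brevity (the real code lives in vdom.js, and the
names below are invented), the strategy is:

    def vdom_update(replace_all_content, update_one_in_place,
                    new_nodes, old_nodes):
        new_keys = [node["key"] for node in new_nodes]
        old_keys = [node["key"] for node in old_nodes]
        if new_keys != old_keys:
            # A key was added, removed, or reordered: re-render the whole ul.
            replace_all_content("".join(node["render"]() for node in new_nodes))
            return
        for new_node, old_node in zip(new_nodes, old_nodes):
            if new_node["data"] != old_node["data"]:
                # Same key, changed data: patch only this item in place.
                update_one_in_place(new_node)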
Note that this code seems to play nice with simplebar.
Also, this code continues to use templates to render
the individual list items.
FWIW this code isn't radically different than list_render,
but it's got some key differences:
- There are fewer bells and whistles in this code.
Some of the stuff that list_render does is overkill
for the PM list.
- This code detects data changes.
Note that the vdom scheme is agnostic about templates;
it simply requires the child nodes to provide a render
method. (This is similar to list_render, which is also
technically agnostic about rendering, but which also
does use templates in most cases.)
These fixes are somewhat related to #13605, but we
haven't gotten a solid repro on that issue, and
the scrolling issues there may be orthogonal to the
redraws. But having fewer moving parts here should
help, and we won't get the rug pulled out from under
us on every presence update.
There are two possible extensions to this that are
somewhat overlapping in nature, but can be done
one at a time.
* We can do a deeper vdom approach here that
gets us away from templates, and just have
nodes write to an AST. I have this on another
branch, but it might be overkill.
* We can avoid some redraws by detecting where
keys are moving up and down. I'm not completely
sure we need it for the PM list.
If this gets merged, we may want to try similar
things for the stream list, which also does a fairly
complicated mixture of big-hammer re-renders and
surgical updates-in-place (with custom code).
BTW we have 100% line coverage for vdom.js.
This legacy cross-realm bot hasn't been used in several years, as far
as I know. If we wanted to re-introduce it, I'd want to implement it
as an embedded bot using those common APIs, rather than the totally
custom hacky code used for it that involves unnecessary queue workers
and similar details.
Fixes #13533.
responses is a module analogous to httpretty for mocking external
URLs, with a very similar interface (potentially cleaner in that it
makes use of context managers).
The most important (in the moment) problem with httpretty is that it
breaks the ability to use redis in parts of code where httpretty is
enabled. From more research, the module in general has a tendency to
have various troublesome bugs with breaking URLs that it shouldn't be
affecting, caused by it working at the socket interface layer. While
those issues could be fixed, responses seems to be less buggy (based
on both third-party reports like ckan/ckan#4755 and our own experience
in removing workarounds for bugs in httpretty) and is more actively
maintained.
In this commit, we basically match any kind of jinja2 start tag,
no matter its special kind (e.g. jinja2_whitespace_stripped_start),
to any kind of jinja2 end tag (e.g. jinja2_whitespace_stripped_end).
The idea is that special operators like `-` do not change the meaning
of the inline tag, and thus matching shouldn't depend upon them.
Zulip has had a small use of WebSockets (specifically, for the code
path of sending messages, via the webapp only) since ~2013. We
originally added this use of WebSockets in the hope that the latency
benefits of doing so would allow us to avoid implementing a markdown
local echo; they were not. Further, HTTP/2 may have eliminated the
latency difference we hoped to exploit by using WebSockets in any
case.
While we’d originally imagined using WebSockets for other endpoints,
there was never a good justification for moving more components to the
WebSockets system.
This WebSockets code path had a lot of downsides/complexity,
including:
* The messy hack involving constructing an emulated request object to
hook into doing Django requests.
* The `message_senders` queue processor system, which increases RAM
needs and must be provisioned independently from the rest of the
server.
* A duplicate check_send_receive_time Nagios test specific to
WebSockets.
* The requirement for users to have their firewalls/NATs allow
WebSocket connections, and a setting to disable them for networks
where WebSockets don’t work.
* Dependencies on the SockJS family of libraries, which has at times
been poorly maintained, and periodically throws random JavaScript
exceptions in our production environments without a deep enough
traceback to effectively investigate.
* A total of about 1600 lines of our code related to the feature.
* Increased load on the Tornado system, especially around a Zulip
server restart, and especially for large installations like
zulipchat.com, resulting in extra delay before messages can be sent
again.
As detailed in
https://github.com/zulip/zulip/pull/12862#issuecomment-536152397, it
appears that removing WebSockets moderately increases the time it
takes for the `send_message` API query to return from the server, but
does not significantly change the time between when a message is sent
and when it is received by clients. We don’t understand the reason
for that change (suggesting the possibility of a measurement error),
and even if it is a real change, we consider that potential small
latency regression to be acceptable.
If we later want WebSockets, we’ll likely want to just use Django
Channels.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
I added this tool a few years ago, and I did have
a vision for how it would improve our codebase, but
I can't remember exactly where I was going with it.
At this point the tool is just a little too noisy
to be helpful. An example of it creating confusion
was a recent PR where somebody was patching
user_circle_class in the PM list, and we already
had similar code in the buddy list, because they
use the same CSS. I mean, there was possibly a way
that the code could have been structured to remove
some of the duplication, but it probably would have
just moved the complexity around.
I just don't think it's worth maintaining the tool
at this point.
This experimental setting disables sending private messages in Zulip
in a crude way (i.e. users get an error when they try to send one).
It makes no effort to adjust the UI to avoid advertising the idea of
sending private messages.
Fixes #6617.
Addresses point 1 of #13533.
MissedMessageEmailAddress objects get tied to the specific message
that was missed by the user. A useful benefit of that is that an email
sent to that address will handle topic changes - if the message that was
missed gets its topic changed, the email response will get posted under
the new topic, while in the old model it would get posted under the
old topic, which could potentially be confusing.
Migrating redis data to this new model is a bit tricky, so the migration
code has comments explaining some of the compromises made there, and
test_migrations.py tests handling of the various possible cases that
could arise.
This test mostly tests logic that I'm about
to remove in subsequent commits, and it's a bit
messy.
This commit removes 100% line coverage, but I
will restore that a few commits later.
We have ~5 years of proof that we'll probably never
extend Dict with more options.
Breaking the classes apart makes both a little faster
(no options to check), and we remove some options
in FoldDict that are never used (from/from_array).
A possible next step is to fine-tune the Dict to use
Map internally.
Note that the TypeScript types for FoldDict are now
more specific (requiring string keys). Of course,
this isn't really enforced until we convert other
modules to TS.
Fixes this error after rebooting the host:
$ sudo ./destroy-all -f
zulip-install-bionic-41MM2
lxc-stop: zulip-install-bionic-41MM2: tools/lxc_stop.c: main: 191 zulip-install-bionic-41MM2 is not running
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The host environment variables (especially PATH) should not be allowed
to pollute the test and could interfere with it.
This allows test-install to run on a NixOS host.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Fixes #13452.
The migration from UserProfile.is_realm_admin/UserProfile.is_guest in
e10361a832 broke our LDAP-based support
for setting a user's role via LDAP properties, which relied on setting
those fields. Because the django-auth-ldap feature powering that only
supports booleans (and in any case, we don't want to expose constants
like `ROLE_REALM_ADMINISTRATOR` to the LDAP configuration interface),
it makes sense to provide setters for these legacy fields for
backwards-compatibility.
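For intuition, such a legacy setter looks roughly like this (sketched
against a stand-in class; the numeric role values are illustrative):

    class UserProfile:  # stand-in for the real Django model
        ROLE_REALM_ADMINISTRATOR = 200
        ROLE_MEMBER = 400

        def __init__(self) -> None:
            self.role = UserProfile.ROLE_MEMBER

        @property
        def is_realm_admin(self) -> bool:
            return self.role == UserProfile.ROLE_REALM_ADMINISTRATOR

        @is_realm_admin.setter
        def is_realm_admin(self, value: bool) -> None:
            # Only mutates .role; RealmAuditLog entries and events still
            # happen where the change is actually save()d, as discussed below.
            if value:
                self.role = UserProfile.ROLE_REALM_ADMINISTRATOR
            elif self.role == UserProfile.ROLE_REALM_ADMINISTRATOR:
                self.role = UserProfile.ROLE_MEMBER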
We lint against using these setters directly in Zulip's codebase.
The issue with using them is that when changing a user's .role, we
want to create appropriate RealmAuditLog entries and send
events. This isn't possible when using these setters - the log entries
and events should be created if the role change in the UserProfile is
actually save()-ed to the database - and on the level of the setter
function, it's not known whether the change will indeed be saved.
It would have to be somehow figured out on the level of post_save
signal handlers, but it doesn't seem like a good design to have such
complexity there, for the sake of setters that generally shouldn't be
used anyway - because we prefer the do_change_is_* functions.
The purpose of this change is narrowly to handle use cases like the
setattr on these boolean properties.
A bug in Zulip's new user signup process meant that users who
registered their account using social authentication (e.g. GitHub or
Google SSO) in an organization that also allows password
authentication could have their personal API key stolen by an
unprivileged attacker, allowing nearly full access to the user's
account.
Zulip versions between 1.7.0 and 2.0.6 were affected.
This commit fixes the original bug and also contains a database
migration to fix any users with corrupt `password` fields in the
database as a result of the bug.
Out of an abundance of caution (and to protect the users of any
installations that delay applying this commit), the migration also
resets the API keys of any users where Zulip's logs cannot prove the
user's API key was not previously stolen via this bug. Resetting
those API keys will be inconvenient for users:
* Users of the Zulip mobile and terminal apps whose API keys are reset
will be logged out and need to login again.
* Users using their personal API keys for any other reason will need
to re-fetch their personal API key.
We discovered this bug internally and don't believe it was disclosed
prior to our publishing it through this commit. Because the algorithm
for determining which users might have been affected is very
conservative, many users who were never at risk will have their API
keys reset by this migration.
To avoid this on self-hosted installations that have always used
e.g. LDAP authentication, we skip resetting API keys on installations
that don't have password authentication enabled. System
administrators on installations that used to have email authentication
enabled, but no longer do, should temporarily enable EmailAuthBackend
before applying this migration.
The migration also records which users had their passwords or API keys
reset in the usual RealmAuditLog table.
This commit was automatically generated by `tools/lint --only=eslint
--fix`, after an `.eslintrc.json` change.
A half dozen files were removed from the changes by tabbott pending
further work to ensure we avoid breaking valuable PRs with merge
conflicts.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
We'll be soon documenting a production workflow that involves using
it, and that means it needs to live under scripts/ (since tools/ isn't
present in release tarballs).
Upcoming changes in test_generated_curl_examples_for_success modify
various data of the iago user heavily, so it's much easier to run
test_the_api first than to make various changes in the tests of the
test_the_api function.
This suppresses the mypy message “Success: no issues found in 1085
source files” or “Found 1 error in 1 file (checked 1085 source files)”
in the output of lint.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This tool doesn’t match our current workflow for Python requirements
upgrades as of commit ec9bf6576a (#13213).
It also has a type error with mypy 0.730, which would be easily fixable,
but removing it is easier.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This tool provides no value over `pip list --outdated`. It also has a
type error with mypy 0.730, which would be easily fixable, but
removing it is easier.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Then, find and fix a predictable number of previous misuses.
With a small change by tabbott to preserve backwards compatibility for
sending `yes` for the `forged` field.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
- Moves "Authentication in the development environment" from subsystems
to "development/authentication.md".
- Moves "Renumbering migrations" to a section within "Schema migrations".
Webpack code splitting will make the inclusion order of CSS files less
obvious, and we need to guarantee that these rules follow the rules
they override.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
login_context now gets the social_backends list through
get_social_backend_dicts and we move display_logo customization
to backend class definition.
This prepares for easily adding multiple IdP support in SAML
authentication - there will be a social_backend dict for each configured
IdP, also allowing display_name and icon customization per IdP.
This adds the general machinery required, and sets it up for the file
`typing_status.js` as a first use case.
Co-authored-by: Anders Kaseorg <anders@zulipchat.com>
It happens that commonmark, python-jose, and python-twitter don’t
actually use future on Python 3, and moto uses aws-xray-sdk in such a
way that it doesn’t use future, but this was a weird game to be
playing just to remove one dependency, and it caused CI failures after
new releases of future, so let’s just include it.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Previously, while Django code that relied on EXTERNAL_HOST and other
settings would know the Zulip server is actually on port 9991, the
upcoming Django SAML code in python-social-auth would end up detecting
a port of 9992 (the one the Django server is actually listening on).
We fix this using X-Forwarded-Port.
Apparently, the CircleCI and Codecov links (and the Codecov badge)
weren't pointing specifically at master, so they'd sometimes show
state from the latest push to a pull request, which isn't a
reasonable way to advertise whether the project's build is passing.
Apparently Tornado decompresses gzip responses by default. Worse, it
fails to adjust the Content-Length header when it does.
https://github.com/tornadoweb/tornado/issues/2743
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The reason that `pip-tools` running on Python 3 didn’t detect the
right requirements for `thumbor` on Python 2 is simply that some of
them are conditional on the Python version.
As for the requirements that had been manually added as a workaround:
`backports-abc` and `singledispatch` are now correctly detected, while
`backports.ssl-match-hostname` was vendored into `urllib3` some time
ago and `certifi` is no longer necessary.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This reverts commit 073ecaac66 (#9365).
This exception handler was overly broad in catching all `OSError`s,
and it made debugging harder by hiding the actual exception.
Furthermore, we no longer use NFS (#12963), and we’re now getting
reports of Windows users running into this message.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Otherwise Bootstrap doesn’t get minified, and also the minification
state is incorrectly reflected in the webpack cache.
The Terser plugin is used by default; we need to include it explicitly
to avoid removing it.
Switch from cssnano to clean-css because it’s noticeably faster.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
To replace DISTRIB_FAMILY, there’s now an os_families function using
the standard ID and ID_LIKE information in /etc/os-release.
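A minimal sketch, assuming the parsed key/value contents of
/etc/os-release are passed in as a dict:

    from typing import Dict, Set

    def os_families(distro_info: Dict[str, str]) -> Set[str]:
        # e.g. Ubuntu has ID=ubuntu and ID_LIKE=debian, giving
        # {"ubuntu", "debian"}.
        return {distro_info["ID"], *distro_info.get("ID_LIKE", "").split()}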
Fixes #13070; fixes #13071.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
It’s about as fast as node-sass (faster, according to their
benchmarks) and more flexible. Autoprefixer is neat: we can now go
delete all our -moz-, -webkit-, etc. lines and have them autogenerated
as necessary based on .browserslistrc.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
We no longer use tsearch_extras, and the camo patch is irrelevant on
systemd systems (Xenial and newer). So we no longer need to
provide/install a PPA at all.
Closes #13027.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Now that we've implemented tsearch_extras in pure postgres, we no
longer need a custom extension. This should help us considerably, as
it means we no longer need to ship custom apt packages at all.
Fixes #467.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Follow-up of commit 2a1305d. Replace all local variables named 'msgid'
with 'message_id' in all JS and HTML files, and add a linter rule for
it as well.
Resolves #12952.
It doesn't require scripts to install, allowing us to migrate yarn to
the more secure --ignore-scripts option.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
It’s unclear why pip-tools considers these packages unsafe, and
excluding them from being pinned has resulted in nondeterministic
output that makes our test suite unhappy.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
mypy no longer has a `--quick` option. Its argument parser
autocompletes `--quick` to `--quickstart-file`, leading to a confusing
error message.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
mypy in daemon mode takes some 400 MiB of memory, and cannot follow
imports of type-annotated third-party packages; meanwhile, non-daemon
mode is no longer nearly as slow as it once was.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
futures is no longer there to be removed. Be clear about why we’re
removing future (it was never a “pip-tools bug”), and leave evidence
behind to help indicate how long that will be needed.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Run pip-compile with --quiet so we don’t have to redirect its stderr;
then we can see any exceptions it might throw. Print any resulting
diff in the right order and without extra newlines separating each
line.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
These are not the latest versions, but pip-tools 3.9.0 or 4.0.0 fails
to resolve dependencies from Git URLs:
pip._internal.exceptions.DistributionNotFound: No matching distribution found for zulip==0.6.1_git (from -r requirements/common.in (line 135))
while pip 19.2 breaks pip-tools 3.8.0:
TypeError: __init__() got an unexpected keyword argument 'find_links'
Fixes #10802.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This new test runs each generated curl example against the Zulip API,
checking whether it returns successfully without errors.
Significantly modified by tabbott for simplicity.
django.setup is already called (with different/better environment
variables) inside test_server_running; we shouldn't be calling it just
before that to make imports work.
I discovered this because imports done at the wrong time would
potentially incorrectly have `testserver` as the EXTERNAL_HOST.
This lets us clean up the linter that excludes the use of get_stream,
and by adding the access_unchecked in the name we make it clear that
it should be used with caution.
Refactoring idea by Tim Abbott.
Polling for changes every 100 milliseconds was burning enough CPU to
set mid-2015 MacBooks on fire. Use the default inotify watching,
except on filesystems where that’s known not to work (nfs, vboxsf), in
which case polling once per second is more than enough for even the
fastest typers.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
A HEAD response has a Content-Length but no body; it’s not correct in
that case to let Tornado default Content-Length to 0.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
tools/linter_lib/pyflakes.py:35: error: Argument 3 to "run_pyflakes" has incompatible type "List[Tuple[bytes, bytes]]"; expected "List[Tuple[str, str]]"
tools/linter_lib/custom_check.py:110: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:214: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:214: error: Argument "shebang_rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:502: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:502: error: Argument "shebang_rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:519: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:706: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:728: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:738: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:779: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
tools/linter_lib/custom_check.py:779: error: Argument "length_exclude" to "RuleList" has incompatible type "Set[str]"; expected "List[str]"
tools/linter_lib/custom_check.py:803: error: Argument "length_exclude" to "RuleList" has incompatible type "Set[str]"; expected "List[str]"
tools/linter_lib/custom_check.py:805: error: Unsupported operand types for + ("List[Rule]" and "List[Dict[str, Any]]")
tools/linter_lib/custom_check.py:819: error: Argument "rules" to "RuleList" has incompatible type "List[Dict[str, Any]]"; expected "List[Rule]"
These were missed because the `zulint` package was missing PEP 561
type annotation markers, and if it'd had them, mypy daemon mode
would've required us to set `follow_imports = skip` for it.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Delete trailing newlines from all files, except
tools/ci/success-http-headers.txt and tools/setup/dev-motd, where they
are significant, and static/third, where we want to stay close to
upstream.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Previous cleanups (mostly the removals of Python __future__ imports)
were done in a way that introduced leading newlines. Delete leading
newlines from all files, except static/assets/zulip-emoji/NOTICE,
which is a verbatim copy of the Apache 2.0 license.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This gives us access to typing_extensions.Deque, which was not added
to typing until 3.5.4.
(PROVISION_VERSION is not bumped because the transitive dependency set
in dev.txt hasn’t changed.)
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This moves our main CSS for rendered Zulip message content into an
external file, which may be reusable but in any case should make it
easier to find this content.
As a result of dropping support for trusty, we can remove our old
pattern of putting `if False` before importing the typing module,
which was essential for Python 3.4 support, but not required and maybe
harmful on newer versions.
cron_file_helper
check_rabbitmq_consumers
hash_reqs
check_zephyr_mirror
check_personal_zephyr_mirrors
check_cron_file
zulip_tools
check_postgres_replication_lag
api_test_helpers
purge-old-deployments
setup_venv
node_cache
clean_venv_cache
clean_node_cache
clean_emoji_cache
pg_backup_and_purge
restore-backup
generate_secrets
zulip-ec2-configure-interfaces
diagnose
check_user_zephyr_mirror_liveness
Otherwise python3 will be perpetually copied from virtualenv to
virtualenv and will never receive updates from the system.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
For .start-button, Bootstrap carousel already supports <button
data-target> as a valid alternative to <button href>. For
.call-to-action, the margin is decreased to exactly offset the lack of
margin collapsing with display: inline-block. There should be no
visual change.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This change serves to declutter webhook-errors.log, which is
filled with too many UnexpectedWebhookEventType exceptions.
Keeping UnexpectedWebhookEventType in zerver/lib/webhooks/common.py
led to a cyclic import when we tried to import the exception in
zerver/decorators.py, so this commit also moves this exception to
another appropriate module. Note that our webhooks still import
this exception via zerver/lib/webhooks/common.py.
When we add Plus, the first sentence should change to "Available on Zulip
Standard and Plus".
I copied the styling of .tip out of expediency, but it's also possible that
long term we'll want only 1 tip-like box styling.
The hover styling is a bit random, but I tried to copy other hover styles I
found in settings.scss.
Note that this renames .upgrade_realm_plan_type_suggestion to .upgrade-tip.
Mismatching imports from outside and inside the virtualenv in the same
process was causing segfaults after apparently benign changes to the
script!
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
There’s no reason to monkey-patch something that we were already
subclassing.
Removing the PRODUCTION conditional causes us to generate
staticfiles.json in the right place to begin with so we don’t need to
move it later. It also allows Django to find staticfiles.json if
running the dev server with PIPELINE_ENABLED.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Otherwise the files aren’t processed by collectstatic and don’t end up
in the staticfiles.json manifest.
Signed-off-by: Anders Kaseorg <andersk@zulipchat.com>
Otherwise the file isn’t processed by collectstatic and doesn’t end up
in the staticfiles.json manifest.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
`valid_indent_html` allows for replacing the incorrectly
indented file with the correct pretty-printed version
if `--fix` is passed to `tools/lint`.
Fixes #12641.
This tool can be used to update the API key field of local
zuliprc files for dummy users of the development server
(iago, prospero, etc.) with the correct API key from the database.
This tool can be run after provisioning (or similar tools), which
changes the API keys in the database.
This file was unchecked until the .handlebars ↦ .hbs rename, so this
is the easiest way to get tests passing again.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
We had several patches to spectrum, but the only essential one
(0ea770fc18) had already been fixed upstream,
and another was just handling jQuery deprecation warnings for not yet removed features.
See #12749 for details.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
As part of dropping support, we add appropriate error messaging when a
user attempts to provision while using trusty. If the user is running
in Vagrant we append information on how to proceed.
We don’t need a hacked copy anymore. We run the installed version out
of node_modules in development, and a Webpack-bundled version of that
in production.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Moving bootstrap-typeahead from bundles/commons.js to bundles/app.js
and csrf.js from bundles/app.js to bundles/commons.js makes
bundles/commons.js equivalent to the "common" bundle, so we can
replace the latter with the former.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The minimal syntactic sugar it might provide isn’t worth the
unexpected side effects (including side effects on third party
modules).
For now, we allow zrequire to emulate the previous syntax in the Node
test suite, even though stealing part of the NPM namespace is
confusing.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
We don’t have any .tsx files, and nobody expects to be able to omit
the extension when importing .json, .scss, or .css files.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
set iteration order is randomized in Python ≥ 3.3. That might or
might not have had the potential for causing rare probabilistic bugs,
but if nothing else, it made build logs harder to compare.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
As of commit cff40c557b (#9300), these
files are no longer served directly to the browser. Disentangle them
from the static asset pipeline so we can refactor it without worrying
about them.
This has the side effect of eliminating the accidental duplication of
translation data via hash-naming in our release tarballs.
This reverts commit b546391f0b (#1148).
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The VNU_IGNORE whitelist lets in some crazy-invalid preexisting HTML,
but hopefully this will stop the problem from getting much larger.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This way we inherit more of the upstream command’s behavior.
Importantly, this means we pass everything in `opts.spargs` to the
spider, not just `opts.spargs.skip_external`.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Cache-loader is used as an item in the use member so the correct
type should be RuleSetUseItem not RuleSetRule.
See: DefinitelyTyped/DefinitelyTyped/webpack/index.d.ts#L498
Given that mini-css-extract-plugin can now do hot module replacement,
this commit also removes css-hot-loader. We are not upgrading to 0.7.0,
as that causes webpack to crash.
This is a simple, non-intrusive way of removing the bulk of the
clutter from `var/<uuid>/test-backend` after running `test-backend`.
Ideally, we'll replace this logic with proper tearDown methods.
This helps generalize the use of groups inside zulint.
Introduce list_files to return `by_lang` files dict.
Add feature to create custom groups.
Make custom groups for backend and frontend files.
This makes linting rules in zulint more general. Make necessary
changes in tools/lint and tools/custom_check.py to run with the new
RuleList class.
Modify tests for `RuleList` class. Tests only include minor changes to
test with the new class.
Previously, it didn't properly update the stamp files that determine
our caching behavior, so if one ran test-backend afterwards, nothing
would happen.
A secondary issue that this commit does not fix is that provision will
end up rerunning the whole thing.
This fixes an issue where one could end up with a `(` in the markdown
syntax for a link after copy-pasting this, which doesn't work in
markdown.
Fixes #12579.
A function was written in `test_fixtures.py` to drop a test database
template if the corresponding database id doesn't belong to a file.
Alongside this fact, every file that is written is removed after 60
minutes, meaning any potential database template can never exist
longer than one hour.
This follow-up work was added to deal with the potential race
conditions when running `test-backend`. Ensuring that all templates
are properly dealt with.
Essentially rewritten by tabbott for cleanliness.
Fixes the remainder of #12426.
When running the `./tools/cache-zulip-git-version` script on Travis, the script
fails because Travis gets a shallow clone of the repository, and not a full
clone. This commit changes the script to fail gracefully, if we are unable to
get the version information using `git describe`.
When the command fails, it still writes an empty `zulip-git-version` and that
has not been changed to keep creation of the release tarball simple, and
avoiding a check for whether the file has any content. The code that sets
`ZULIP_VERSION` checks whether the contents of the `zulip-git-version` file are
empty, before setting `ZULIP_VERSION`. So, the version should never be set to an
empty string.
More modern Linux versions like Bionic will block this, and what we
actually want to do is just run the code in our <<EOF block via bash,
so we should do that explicitly.
The integrations page had css in both `landing-page.scss` and
`portico.scss`. With this commit, the styles are mostly unified into
a single separate file.
This moves all the stylesheets like stats, billing etc. to another
directory called `static/styles/portico/`, matching the directory
structure of our JavaScript.
We use `git describe --tags` to get information about the number of commits since
the last major version, and the sha of the current HEAD. This is added to the
ZULIP_VERSION when a deploy is done from `git`.
Modified heavily by punchagan to:
* to use git describe instead of `git log` and `wc`
* use a separate script to run the git describe command
* write the file with version info to var/ and remove it from the repo
Fixes #4685.
Profiling shows that using cache-loader saves ~6-7 seconds of the time
taken by webpack-dev-server on subsequent runs. The overhead this adds
when nothing is cached (when running the first time) is around 1-2
seconds. We don't use cache-loader for ts-loader, since the webpack docs
say it will slow it down, or for file-loader, since it just copies files
over and caching them would just waste disk space.
This is the second merge of this commit. It fixes the issue with the
previous one by placing cache-loader after mini-css-loader, because it
just extracts css, and caching that will make file-loader not run, which
in turn breaks the development environment.
Profiling shows that using cache-loader saves ~6-7 seconds of the time
taken by webpack-dev-server on subsequent runs. The overhead this adds
when nothing is cached (when running the first time) is around 1-2
seconds. We don't use cache-loader for ts-loader, since the webpack docs
say it will slow it down, or for file-loader, since it just copies files
over and caching them would just waste disk space.
Profiling data:
-------- Master ---------
~/zulip (master) $ tools/webpack --watch | ts -s '%.S' # master
03.995825 ℹ 「wds」: Project is running at http://127.0.0.1:9994/
03.996161 ℹ 「wds」: webpack output is served from /webpack/
03.996289 ℹ 「wds」: Content not from webpack is served from ...
19.284477 ℹ 「wdm」:
19.285371 ℹ 「wdm」: Compiled successfully.
-------- cache-loader ---------
~/zulip (cache-loader)$ tools/webpack --watch | ts -s '%.S'
04.107913 ℹ 「wds」: Project is running at http://127.0.0.1:9994/
04.108646 ℹ 「wds」: webpack output is served from /webpack/
04.109068 ℹ 「wds」: Content not from webpack is served from ...
12.633782 ℹ 「wdm」:
12.634083 ℹ 「wdm」: Compiled successfully.
Previously, we defined our own Loader type, which was partly incorrect
because it only allowed the `use` property to be a string. We now use the
RuleSetRule type provided by webpack instead.
Real systemd requires this. docker-systemctl-replacement currently
doesn’t but maybe it will later.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This exchanges a race condition where webpack-dev-server might not be
stopped on a poorly timed KeyboardInterrupt for a less bad race
condition where we might get an UnboundLocalError.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
This should make the run-dev.py user experience a lot nicer when
switching branches away from a branch that is at least as new as this
commit, since we won't need to manually restart run-dev.py to restart
webpack.
Fixes #9042.
This doesn’t seem to add any noise in the normal case, but if anything
shows up here we might want to see it.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
* Remove the custom has_error logic in favor of checking whether any
errors were logged, which gives us a much better chance at catching
unanticipated exceptions.
* Use our error_callback for the initial requests of start_urls too.
* Clean up mypy types.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The jedi package exec()s some code in the context of the fake module
blub, causing errors when generating the coverage report. See
https://github.com/davidhalter/jedi/issues/1122.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The test-backend parallel test runner system doesn't actually use the
zulip_test database; instead, it creates its own databases off the
zulip_test_template database.
We were accidentally running `tools/generate_fixtures` even when there
are no changes, because this function is shared with the
tools/lib/test_server.py codebase, which needs us to do the work of
creating a test database for it off the zulip_test_template database.
Fixing this saves about 1.5s / 4s of the runtime of a single test.
This restores man pages and other documentation that have been
stripped from the default Ubuntu cloud image and installs
ubuntu-minimal and ubuntu-standard.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
perfect-scrollbar replaces both the appearance and the behavior of the
scrollbar, and its emulated behavior will never feel native on most
platforms. SimpleBar customizes the appearance while preserving the
native behavior.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
Upgrades to the stripe library can sometimes break semantics for our
billing system, and so we should make sure to use our documented
testing process before doing them.
Using sys.exit(1) in a management command makes it impossible
to unit test the code in question. The correct approach to
do the same thing in Django management commands is to raise
CommandError.
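For illustration, a minimal sketch of the pattern (the command and its option are hypothetical):
```
from django.core.management.base import BaseCommand, CommandError

class Command(BaseCommand):
    help = "Hypothetical command illustrating the error-handling pattern."

    def add_arguments(self, parser):
        parser.add_argument("--realm", help="subdomain of the realm to act on")

    def handle(self, *args, **options):
        if options["realm"] is None:
            # CommandError is caught by Django's management framework, which
            # prints the message and exits nonzero; unit tests can assert on
            # it with assertRaises, unlike a bare sys.exit(1).
            raise CommandError("You must specify --realm.")
```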
This commit adds a new developer tool: The "integrations dev panel"
which will serve as a replacement for the send_webhook_fixture_message
management command as a way to test integrations with much greater ease.
This lets us handle directly in our tooling the user experience that
we document for exporting a realm with member consent (before, it
required unpleasant manual work).
This commit migrates the Subscription's notification fields from a
BooleanField to a NullBooleanField where a value of None means to
inherit the value from user's profile.
Also includes a migration to set the corresponding settings to None
if they match the user profile's values. This migration helps us in
getting rid of the weird "Apply to all" widget that we offered on
subscription settings page.
The mobile apps can't handle None appearing as the stream-level
notification settings, so for backwards-compatibility we arrange to
only send True/False to the mobile apps by applying those defaults
server-side. We introduce a notification_settings_null value within a
client_capabilities structure that newer versions of the mobile apps
can use to request the new model.
This mobile compatibility code is pretty effectively tested by the
existing test_events tests for the subscriptions subsystem.
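A minimal sketch of the fallback semantics described above (the helper name and signature are hypothetical, not the actual server code):
```
from typing import Optional

def stream_push_notifications_for_client(
    stream_setting: Optional[bool],
    user_default: bool,
    notification_settings_null: bool,
) -> Optional[bool]:
    # Clients that declare notification_settings_null in client_capabilities
    # can handle None, which means "inherit from the user's profile".
    if notification_settings_null:
        return stream_setting
    # Legacy mobile clients only understand True/False, so resolve the
    # inheritance server-side before sending the value.
    return user_default if stream_setting is None else stream_setting
```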
It was discovered that the '.eslintcache' file was causing eslint to
throw a TypeError after a recent update/addition to the dependencies.
It makes sense to remove this file as part of the provisioning process
to avoid such exceptions.
This commit removes `tools/check-urls`. It was added as
a useful tool in preparation for the Django 1.10 migration.
Since we completed that migration, it is no longer needed.
Fixes #12180.
The number of processes used to run the backend tests was previously a
hardcoded value; this commit transitions the default to be based on the
number of logical CPUs available.
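Roughly how such a default can be derived (the actual flag name in test-backend may differ):
```
import argparse
import multiprocessing

parser = argparse.ArgumentParser()
parser.add_argument(
    "--processes",
    type=int,
    # One worker per logical CPU instead of a fixed number.
    default=multiprocessing.cpu_count(),
    help="Number of processes to run the backend tests with",
)
```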
This commit adds the `stream_ui_updates.js` module. This module
will include functions that update different UI elements
(e.g. the subscription button and subscriber count).
The github-services model for how GitHub would send requests to this
legacy integration is no longer available since earlier in 2019.
Removing this integration also allows us to finally remove
authenticated_api_view, the legacy authentication model from 2013 that
had been used for this integration (and other features long since
upgraded).
A few functions that were used by the Beanstalk webhook are moved into
that webhook's implementation directly.
This is really a job for an AST parser rather than a pile of regexes;
among other issues, these will still miss violations that span
multiple lines. But, you know, I tried.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Also use psql -e (--echo-queries) in scripts that use ‘set -x’, so
errors can be traced to a specific query from the output.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
`tools/run-dev.py` already backgrounds `tools/webpack` (and deals with
cleaning it up on exit), so there’s no need for `tools/webpack` to
also background the actual `webpack` process. But when running
`tools/webpack` by itself, it’s annoying to clean up the backgrounded
process manually.
Run `webpack` in the foreground, using `os.execvp` so we don’t waste
memory on an intermediate wrapper process.
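A minimal sketch of the foreground exec, with a hypothetical wrapper function and webpack invocation:
```
import os
import sys

def run_webpack_in_foreground(extra_args):
    cmd = ["node_modules/.bin/webpack"] + list(extra_args)
    # os.execvp replaces the current Python process with webpack, so there is
    # no intermediate wrapper process to background or clean up later.
    os.execvp(cmd[0], cmd)

run_webpack_in_foreground(sys.argv[1:])
```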
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
Webpack applies special logic to relative paths provided in
`resolve.modules`, and this logic is expected to be used for
`node_modules`. One case where this is important is when
`node_modules/foo` wants to import a different version of package
`bar` than the one at `node_modules/bar`, and so yarn gives it its own
copy at `node_modules/foo/node_modules/bar`.
It would probably be better to avoid screwing with `resolve.modules`
at all, but this at least brings us one step closer to the default of
just `["node_modules"]`.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
activate_this.py has always documented that it should be exec()ed with
locals = globals, and in virtualenv 16.0.0 it raises a NameError
otherwise.
As a simplified demonstration of the weird things that can go wrong
when locals ≠ globals:
>>> exec('a = 1; print([a])', {}, {})
[1]
>>> exec('a = 1; print([a for b in [1]])', {}, {})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 1, in <module>
File "<string>", line 1, in <listcomp>
NameError: name 'a' is not defined
>>> exec('a = 1; print([a for b in [1]])', {})
[1]
Top-level assignments go into locals, but from inside a new scope like
a list comprehension, they’re read out of globals, which doesn’t work.
Fixes #12030.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The dependency visualizer currently only supports JavaScript files,
such as in the `get_js_edges` function, where only the ".js" extension
is supported. Update the visualizer to support ".ts" files as well and
to output modules without their extensions.
Currently, the `test-js-with-node` tests append ".js" to filenames
without an extension. Since TypeScript is now also supported, this can
produce results such as "dict.ts.js". To remedy this, check for ".ts"
files as well.
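A minimal sketch of the extension handling, with hypothetical helper and directory names:
```
import os

NODE_TESTS_DIR = "frontend_tests/node_tests"  # hypothetical test directory

def resolve_test_filename(name: str) -> str:
    # Leave names that already carry an extension alone, so a TypeScript
    # module like "dict.ts" doesn't become "dict.ts.js".
    if name.endswith((".js", ".ts")):
        return name
    for ext in (".js", ".ts"):
        if os.path.exists(os.path.join(NODE_TESTS_DIR, name + ext)):
            return name + ext
    return name + ".js"
```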
The spider raises exceptions when errors like FileNotFound
are detected. However, these did not set the error state
before exiting, causing the spider to fail silently.
This patch sets the status, causing exceptions to exit with a
non-zero exit status.
All the inline javascript code present in email_log.html (which is
rendered when the user visits "/emails" in development mode) is
transferred to a new file: email_log.js in the portico/ directory.
Fixes #11608.
As of #367, `tools/run-dev-queue-processors` has evolved into nothing
more than an unnecessarily elaborate wrapper around `manage.py
process_queue --all`. Remove it (mostly to make it marginally easier
to Tab-complete `tools/run-dev.py`, if I’m being honest).
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
The delete operator could throw a TypeError when attempting to
remove a non-configurable property, which is rare in practice since
they can only be created using `Object.defineProperty()` and
`Object.freeze()`. We also never use the output of `del()` anyway.
A new javascript file "dev-login.js" is created in static/js/portico/,
and the inline javascript code present in dev_login.html is transferred
to that file. An empty div element with a unique data-page-id attribute
is added to dev_login.html, to make it easier to tell which page we are
on when working with the javascript code.
This generalizes the provision logic for deciding whether to build our
tsearch_extras and pgroonga search extensions from source to support
Ubuntu cosmic as well (and eventually, other future platforms).
It appears that this code did the right thing despite being written
wrong, probably due to whatever `manage.py collectstatic` does in its
argument parsing. But in any case, we should make the code read how
it's intended.
Accomplished by adding a function to clear the status message with
an empty string. The html is then updated to reflect changes without a
refresh.
Currently, it's a small hassle to clear a status message. This option
makes things a bit easier.
Fixes #11630.
This optimizes test-backend by skipping webhook
tests when run in default mode.
Tweaked by tabbott to extend the documentation and update the CI
commands.
This is mostly adding markup, calling some convenient
functions in buddy_data.js, and adjusting CSS.
To make the circles update dynamically, I mostly
orchestrate this through activity.js for now. It's
possible we'll want to adjust that eventually to
happen through something like a `presence_events`
dispatcher, but that's essentially what
a good part of `activity.js` does now.
Commit 7d12e2019d (#10994) broke fresh
provisions by importing zproject.settings before we were done
modifying settings. Fix it by moving the generate_secrets invocation
to the earliest reasonable place.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
This commit does the following three things:
1. Update the stream model to accommodate a rendered description.
2. Render and save the stream rendered description on update.
3. Render and save stream descriptions on creation.
Further, the stream's rendered description is also sent whenever the
stream's description is being sent.
This is preparatory work for eliminating the use of the
non-authoritative marked.js markdown parser for stream descriptions.
Instead, manually activate it in the one place where this
functionality was used (tools/lib/provision.py). This way we avoid
trying to activate the Python 2 thumbor virtualenv from Python 3.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
This seems to be a common enough pitfall to justify
a bit of extra handling. Example output:
$ ./tools/run-dev.py
Clearing memcached ...
Flushing memcached...
OK
Starting Zulip services on ports: web proxy: ...
Note: only port 9991 is exposed to the host in a Vagrant environment.
ERROR: You probably have another server running!!!
Traceback (most recent call last):
File "./tools/run-dev.py", line 421, in <module>
app.listen(proxy_port, address=options.interface)
File "/srv/zulip-py3-venv/lib/python3.5/...
server.listen(port, address)
File "/srv/zulip-py3-venv/lib/python3.5/...
sockets = bind_sockets(port, address=address)
File "/srv/zulip-py3-venv/lib/python3.5/...
sock.bind(sockaddr)
OSError: [Errno 98] Address already in use
Terminated
NOTE: If you revert this commit, you want to revert
the immediately prior commit as well. The history
is that Ishan made some improvements to the widget,
but there were some minor bugs. I decided not
to squash the commits together so that the git
history is clear about who did what. (In particular, I
want questions about the JS code to come to me if
somebody does `git blame`.)
Anyway...
This is a fairly significant rewrite of the polling
widget, where I clean up the overall structure of
the code (including things from before the prior
fix) and try to polish the prior commit a bit as
well.
There are a few new features:
* We tell "other" users to wait for the poll
to start (if there's no question yet).
* We tip the author to say "/poll foo" (as
needed).
* We add edit controls for the question.
* We don't allow new choices until there's
a question.
Apparently, the missed-message templates have a slightly different
structure from our other email templates, which triggered a latent,
subtle bug in inline-email-css's effort to remove duplicate <html>
blocks from emails that had been generated by premailer. Fix this
bug, and add appropriate assertions to prevent similar issues in the
future.
Fixes #11249.
We still create a Python 2 virtualenv for thumbor but that’s
separate (/srv/zulip-thumbor-venv from
scripts/lib/create-thumbor-venv).
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
As part of this change, we port into the .messages class the work in
4e8e7348da to change overflow-y to auto,
not scroll (skipping that would result in a regression).
Otherwise this causes an error
```
AttributeError: type object 'Callable' has no attribute '_abc_registry'
```
on 3.7. While the error is specific to 3.7, it is safer to uninstall
typing for all the versions that don't require a pip-provided typing
library.
We instead get the specific fields from message
that we use. This is particularly helpful
for subject -> topic migration; we no longer
have to account for "subject" fields in
client-side templates.
The crawler used to be called directly for checking external links.
Now the scrapy command calls the crawl_with_status wrapper.
Crawl_with_status has been modified to pass the external parameter in
the previous commit, so we can now use this simpler approach.
This is a major rewrite of the billing system. It moves subscription
information off of stripe Subscriptions and into a local CustomerPlan
table.
To keep this manageable, it leaves several things unimplemented
(downgrading, etc), and a variety of other TODOs in the code. There are also
some known regressions, e.g. error-handling on /upgrade is broken.
/bin/sh and /usr/bin/env are the only two binaries that NixOS provides
at a fixed path (outside a buildFHSUserEnv sandbox).
This discussion was split from #11004.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
This is a common bug that users might be tempted to introduce.
And also fix two instances of this bug that were present in our
codebase, including an important one in our upgrade code path.
To support this, we add a pass_targets option to the main linter
library, because with current mypy, it's generally counterproductive
to pass the list of files in (can produce spurious errors; isn't
faster).
The testing section is more appropriate, since it's fundamentally part
of our CI system.
While we're at it, fix the fact that we were linking to GitHub, not
ReadTheDocs, in the run-mypy output.
This changes a few things:
* Deduplicates deps_to_install logic.
* Has a retry flag, under which we can guard the apt retry print statements.
* Makes the install_system_deps flow more parallel.
The fixture changes are because self.upgrade formerly caused a page load
of /billing, which in turn calls Customer.retrieve.
If we ran the full test suite with GENERATE_STRIPE_FIXTURES=True, we would
likely see several more Customer.retrieve.N.json's being deleted. But
keeping them there for now to keep the diff small.
This commit works by vendoring the couple functions we still use from
puppetlabs stdlib (join and range), but removing the rest of the
puppetlabs codebase, and of course cleaning up our linter rules in the
process.
Fixes #7423.
This optimizes tools/provision by not running
`tools/update-authors-json --use-fixture` unless either the script
itself or its fixtures file (zerver/tests/fixtures/authors.json) was
changed.
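A minimal sketch of the kind of content-hash check involved (the stamp-file path and helper name are hypothetical):
```
import hashlib
import os
import subprocess

def files_changed(paths, stamp_path="var/authors-json.hash"):
    # Hash the tool and its fixture together; rerun only when the digest
    # differs from the one recorded by the previous provision.
    digest = hashlib.sha1()
    for path in paths:
        with open(path, "rb") as f:
            digest.update(f.read())
    new_hash = digest.hexdigest()
    old_hash = ""
    if os.path.exists(stamp_path):
        with open(stamp_path) as f:
            old_hash = f.read().strip()
    if new_hash == old_hash:
        return False
    with open(stamp_path, "w") as f:
        f.write(new_hash)
    return True

if files_changed(["tools/update-authors-json", "zerver/tests/fixtures/authors.json"]):
    subprocess.check_call(["tools/update-authors-json", "--use-fixture"])
```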
Fixes #10991.
This release is from 2018-08-22, a little over 100 days ago.
It was the first release with the important fix so that when the
server advises it to stop displaying a notification because the user
has read the message (as the SEND_REMOVE_PUSH_NOTIFICATIONS server
setting enables), the app doesn't instead replace the notification
with a broken one reading "null". We have that setting running now
on chat.zulip.org, and intend to roll it out more broadly soon.
The `# take 0` thing is a slightly absurd workaround for the fact
that our funky out-of-line way of marking lines to ignore doesn't
work right if there are multiple such lines in a given file that
are equal modulo leading and trailing whitespace.
This option causes test-documentation to only verify internal links
that we control and are important to be correct. This prevents
test-documentation from flaking in CI due to issues with the dozens of
third-party blocks that we link to from various parts of our
documentation.
Tweaked by tabbott for comment clarity, and to also include
github.com/zulip links.
Fixes #10942.
The goal here was to enforce 100% coverage on
parse_narrow, but the code has an unreachable line
and is overly tolerant of bogus urls. This will
be fixed in the next commit.
This fixes an actual user-facing issue in our mobile push
notifications documentation (where we were incorrectly failing to
quote the argument to `./manage.py register_server` making it not
work), as well as preventing future similar issues from occurring
again via a linter rule.
We now let color_data keep its own state for
unused_colors, so that we no longer have to pass in
a large list of unused_colors every time we want
to assign a new stream color.
This mostly matters at startup, where we might
be cycling through 5000 streams. We claim all
the unused colors up front.
Each operation now has an upper bound of expensiveness,
where the worst case scenario is basically popping
off the first element of a list of <= 24 colors.
The algorithm is now deterministic, too, to make
it easier to test. It's unclear whether random color
assignment ever had much benefit, and it made unit
testing the algorithm difficult. Now we have 100%
line coverage.
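A minimal Python sketch of the approach (the real module is the JavaScript color_data; the palette and names here are illustrative):
```
# Illustrative palette; the real one has roughly 24 entries.
STREAM_ASSIGNMENT_COLORS = ["#76ce90", "#fae589", "#a6c7e5", "#e79ab5"]

class ColorData:
    def __init__(self):
        self.unused_colors = []

    def claim_colors(self, subscribed_colors):
        # Done once at startup: compute the unused colors up front so each
        # later assignment doesn't rescan thousands of subscriptions.
        used = set(subscribed_colors)
        self.unused_colors = [c for c in STREAM_ASSIGNMENT_COLORS if c not in used]

    def pick_color(self):
        # Deterministic and cheap: pop the first remaining color, recycling
        # the palette once it is exhausted.
        if not self.unused_colors:
            self.unused_colors = list(STREAM_ASSIGNMENT_COLORS)
        return self.unused_colors.pop(0)
```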
Fixes part of #10902.
We've had a few unpleasant bugs with real documentation links being
broken, so we're going to make this more aggressive for now.
I think we instead want a more subtle option for suppressing failures
in some places but not others.
This library was absolutely essential as part of our Python 2->3
migration process, but all of its calls should be either no-ops or
encode/decode operations.
Note also that the library has been wrong since the incorrect
refactoring in 1f9244e060.
Fixes #10807.
This seems like kind of a silly function to extract
to topic.py, but it will theoretically help us sweep
"subject" if we change the DB.
It had test coverage.
Even though you'd think these regexes would be
cached, compiling the regex outside of looping
through lines makes a difference.
My timings are 8.4s -> 6.0s. (You need to hack
on the linter to isolate the custom checks.)
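A minimal sketch of the change, using a hypothetical rule pattern:
```
import re

# Compiled once at module level rather than inside the per-line loop; even
# with re's internal cache, skipping the repeated lookup is measurably faster.
RULE_PATTERN = re.compile(r"\bsubject\b")  # hypothetical rule

def check_lines(fn, lines):
    errors = []
    for line_number, line in enumerate(lines, 1):
        if RULE_PATTERN.search(line):
            errors.append((fn, line_number, line.strip()))
    return errors
```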
We (lexically) remove "subject" from the conversion code. The
`build_message` helper calls `set_topic_name` under the hood,
so things still have "subject" in the JSON.
There was good code coverage on `build_message`.
Previously, a string ending in "... 😄" was reported as an
error and the linter complained that there should be a space
after the last ':'. This commit changes the pattern so that the
linter only checks for colons that are preceded by an opening
double-quote (").
We now prevent adding "subject" to any code in
zerver/lib, unless you specifically exempt it.
The new set, called `FILES_WITH_LEGACY_SUBJECT`,
also has comments that give a roadmap of what
to fix.
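A minimal sketch of what such an exempted check can look like (the file names and rule pattern are hypothetical):
```
import re

# Hypothetical contents; the real set and rule live in the custom linter config.
FILES_WITH_LEGACY_SUBJECT = {
    "zerver/lib/some_legacy_module.py",  # TODO: migrate "subject" to "topic"
}
SUBJECT_PATTERN = re.compile(r"\bsubject\b")

def check_legacy_subject(fn, lines):
    # Only enforce the rule in zerver/lib, and skip explicitly exempted files.
    if not fn.startswith("zerver/lib/") or fn in FILES_WITH_LEGACY_SUBJECT:
        return []
    return [
        (fn, i, 'use "topic" instead of "subject"')
        for i, line in enumerate(lines, 1)
        if SUBJECT_PATTERN.search(line)
    ]
```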
In tools/pre-commit line 18:
if [ -z "$VIRTUAL_ENV" ] && `which vagrant > /dev/null` && [ -e .vagrant ]; then
^-- SC2092: Remove backticks to avoid executing output.
^-- SC2006: Use $(..) instead of legacy `..`.
^-- SC2230: which is non-standard. Use builtin 'command -v' instead.
In tools/pre-commit line 23:
./tools/lint --no-gitlint --force $changed_files || true
^-- SC2086: Double quote to prevent globbing and word splitting.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/setup/install-aws-server line 25:
zulip_root=${ZULIP_ROOT:-$HOME/zulip}
^-- SC2034: zulip_root appears unused. Verify use (or export if used externally).
In tools/setup/install-aws-server line 40:
if [ -n "$zulip_confdir" ]; then
^-- SC2154: zulip_confdir is referenced but not assigned.
In tools/setup/install-aws-server line 55:
VIRTUALENV_NEEDED=$(if $(echo "$type" | grep -q app_frontend); then echo -n yes; else echo -n no; fi)
^-- SC2091: Remove surrounding $() to avoid executing output.
In tools/setup/install-aws-server line 60:
SSH_OPTS=(-o HostKeyAlgorithms=ssh-rsa)
^-- SC2191: The = here is literal. To assign by index, use ( [index]=value ) with no spaces. To keep as literal, quote it.
In tools/setup/install-aws-server line 69:
ssh "${SSH_OPTS[@]}" "$server" -t -i "$amazon_key_file" -lroot <<EOF
^-- SC2087: Quote 'EOF' to make here document expansions happen on the server side rather than on the client.
In tools/setup/install-aws-server line 86:
ssh "${SSH_OPTS[@]}" "$server" -t -i "$amazon_key_file" -lroot <<EOF
^-- SC2087: Quote 'EOF' to make here document expansions happen on the server side rather than on the client.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/setup/postgres-init-dev-db line 10:
ROOT_POSTGRES="sudo -i -u "$DEFAULT_USER" psql"
^-- SC2027: The surrounding quotes actually unquote this. Remove or escape them.
In tools/setup/postgres-init-dev-db line 46:
echo 'ERROR: Try `sudo service postgresql start`?'
^-- SC2016: Expressions don't expand in single quotes, use double quotes for that.
In tools/setup/postgres-init-dev-db line 64:
PGPASS_ESCAPED_PREFIX="*:\*:\*:$USERNAME:"
^-- SC1117: Backslash is literal in "\*". Prefer explicit escaping: "\\*".
^-- SC1117: Backslash is literal in "\*". Prefer explicit escaping: "\\*".
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/django-template-graph line 10:
for t in $(find -name '*.html' -printf '%P\n'); do
^-- SC2044: For loops over find output are fragile. Use find -exec or a while read loop.
^-- SC2185: Some finds don't have a default path. Specify '.' explicitly.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/deploy-branch line 17:
[ $? -ne 0 ] && error_out "Unknown branch: $branch"
^-- SC2181: Check exit code directly with e.g. 'if mycmd;', not indirectly with $?.
In tools/deploy-branch line 23:
if [ $? -eq 0 ]; then
^-- SC2181: Check exit code directly with e.g. 'if mycmd;', not indirectly with $?.
In tools/deploy-branch line 35:
[ $? -ne 0 ] && error_out "Rebase onto origin/master failed"
^-- SC2181: Check exit code directly with e.g. 'if mycmd;', not indirectly with $?.
In tools/deploy-branch line 39:
[ $? -ne 0 ] && error_out "Push of master to origin/master failed"
^-- SC2181: Check exit code directly with e.g. 'if mycmd;', not indirectly with $?.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/commit-msg line 9:
if [ $(grep '^[^#]' .git/COMMIT_EDITMSG --count) -ne 0 ]; then
^-- SC2046: Quote this to prevent word splitting.
In tools/commit-msg line 10:
lint_cmd="cd ~/zulip && cat \"$1\" | python -m gitlint.cli"
^-- SC2089: Quotes/backslashes will be treated literally. Use an array.
In tools/commit-msg line 11:
if [ -z "$VIRTUAL_ENV" ] && `which vagrant > /dev/null` && [ -e .vagrant ]; then
^-- SC2092: Remove backticks to avoid executing output.
^-- SC2006: Use $(..) instead of legacy `..`.
^-- SC2230: which is non-standard. Use builtin 'command -v' instead.
In tools/commit-msg line 14:
$lint_cmd
^-- SC2090: Quotes/backslashes in this variable will not be respected.
In tools/commit-msg line 17:
if [ $? -ne 0 ]; then
^-- SC2181: Check exit code directly with e.g. 'if mycmd;', not indirectly with $?.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/clean-branches line 33:
echo -n "Deleting local branch $(echo "$ref" | sed 's!^refs/heads/!!')"
^-- SC2001: See if you can use ${variable//search/replace} instead.
In tools/clean-branches line 41:
echo -n "Deleting local branch $(echo "$ref" | sed 's!^refs/heads/!!')"
^-- SC2001: See if you can use ${variable//search/replace} instead.
In tools/clean-branches line 49:
remote_name="$(echo "$ref" | sed 's!^refs/remotes/origin/!!')"
^-- SC2001: See if you can use ${variable//search/replace} instead.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/build-release-tarball line 50:
for i in `cat "$TMPDIR/$prefix/tools/release-tarball-exclude.txt"`; do
^-- SC2013: To read lines rather than words, pipe/redirect to a 'while read' loop.
^-- SC2006: Use $(..) instead of legacy `..`.
In tools/build-release-tarball line 51:
rm -r --interactive=never "$TMPDIR/$prefix/$i";
^-- SC2115: Use "${var:?}" to ensure this never expands to / .
In tools/build-release-tarball line 97:
echo; echo -ne "\033[33mRunning update-prod-static failed. "
^-- SC1117: Backslash is literal in "\0". Prefer explicit escaping: "\\0".
In tools/build-release-tarball line 98:
echo -e "Check $TMPDIR/update-prod-static.log for more information.\033[0m"
^-- SC1117: Backslash is literal in "\0". Prefer explicit escaping: "\\0".
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/build-docs line 3:
cd "$(dirname "$0")"/../docs
^-- SC2164: Use 'cd ... || exit' or 'cd ... || return' in case cd fails.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
This commit adds a test for the payload that is generated when
a Task is moved from one user story to another on Taiga's Sprint
Taskboard UI.
This commit also gets this webhook's test coverage up to 100%.
We drop support for using `icon-vector` as the base class when
including icons from the font awesome icons package.
From now on, only icons specified in font awesome v4.7.0 can be used
in the code base.
This module makes it really easy to create are-you-sure
dialogs for dangerous operations.
Basically it's one function with five parameters. You
give three chunks of HTML, a callback function, and
a parent container.
The first use of this will be in settings_user_groups,
coming up in a couple commits.
IFTTT allows custom templating for their payloads, so the onus is
on the user to ensure that their custom templates conform to the
expectations outlined in our IFTTT webhook docs. For that reason,
these payloads weren't generated, but were manually edited.
After discovering a couple of bugs, I decided to thoroughly test
and rewrite this integration from scratch. The older code wasn't
generating coherent messages.
This commit also gets this integration up to 100% test coverage.
Test coverage was improved by removing an unused function and
removing some code (written by me) that was actually handling
Test Hook event types incorrectly.
It was a painful amount of work to generate the actual payload.
Since the only difference was a small build URL, I manually
edited the payload and used that for testing.
This commit gets our GitHub webhook up to 100% test coverage.
Note that Freshdesk allows custom templating for outgoing payloads
in their webhook UI. Therefore, the payloads added in this commit
did not have to be official payloads from Freshdesk.
The lack of coverage was due to:
* A function that was never used anywhere.
* get_commit_status_changed_body was using a regex where it didn't
really need to use one. And there was an if statement that
assumed that the payload might NOT contain the URL to the commit.
However, I checked the payload and there shouldn't be any instances
where a commit event is generated but there is no URL to the commit.
* get_push_tag_body had an `else` condition that really can't happen
in any payload. I verified this by checking the BitBucket webhook
docs.
We shouldn't just ignore exceptions when encoding the incoming
auth credentials. Even if the incoming credentials are properly
encoded, it is better to know when that is the case or if
something else fails.
Apparently, Travis removed the Heroku bundle of packages from their
servers, which made the build start failing when trying to configure
apt to hold their versions (sigh). This commit removes the
problematic packages.
The companion tool `tools/reset-to-pull-request` has a handy feature
to maintain a local ref tracking the PR: e.g., pr/1234 for PR 1234.
If this were a normal remote-tracking branch maintained by `git fetch`,
it'd get updated on `git push`. Do the same thing here.
This helps keep a view like `gitk --all @` a bit tidier, by causing
merged PRs to stop pointing at side branches of the main history.
Before this change, the way we loaded
webpack for various tools was brittle.
First, I addressed test-api and test-help-documentation.
These tools used to be unable to run standalone on a
clean provision, because they were (indirectly)
calling tools/webpack without the `--test` option.
The problem was a bit obscure, since running things
like `./tools/test-backend` or `./tools/test-all` in
your workflow would create `./var/webpack-stats-test.json`
for the broken tools (and then they would work).
The tools themselves weren't broken; they were only
relying on the common `test_server_running` helper.
And even that helper wasn't broken; it was just that
`run-dev.py` wasn't respecting the `--test` option.
So I made it so that `./tools/run-dev` passes in `--test` to
`./tools/webpack`.
To confuse matters even more, for some reason Casper
uses `./webpack-stats-production.json` via various
hacks for its webpack configuration, so when I fixed
the other tests, it broke Casper.
Here is the Casper-related hack in zproject/test_settings.py,
which was in place before my change and remains
after it:
    if CASPER_TESTS:
        WEBPACK_FILE = 'webpack-stats-production.json'
    else:
        WEBPACK_FILE = os.path.join('var', 'webpack-stats-test.json')
I added similar logic in tools/webpack:
if "CASPER_TESTS" in os.environ:
build_for_prod_or_casper(args.quiet)
I also made the helper functions in `./tools/webpack` have
nicer names.
So, now tools should all be able to run standalone and not
rely on previous tools creating webpack stats files for
them and leaving them in the file system. That's good.
Things are still a bit janky, though. It's not completely
clear to me why `test-js-with-casper` should work off of
a different webpack configuration than the other tests.
For now most of the jankiness is around Casper, and we have
hacks in two different places, `zproject/test_settings.py` and
`tools/webpack` to force it to use the production stats
file instead of the "test" one, even though Casper uses
test-like settings for other things like which database
you're using.
We don't use input.create_non_editable_pill() in our
code yet. If we add this back, we'll want to have node
tests on it.
Removing this unused code brings us to 100% line
coverage for input_pill.js.
This directly reverts 5c11ab85 with the small addition
of adding input_pill to our list of fully covered
modules.
These repositories (`zulip-ios-legacy` and `zulip-android`) are
deprecated, and as such should not have their own tabs, but still
should be included in the total contributions count.
Apparently, `puppet-lint` on Ubuntu trusty throws warnings for certain
quoting patterns that are OK in modern `puppet-lint`. I believe the
old Zulip code was actually correct (i.e. the old `puppet-lint`
implementation was the problem), but it seems worth changing anyway to
suppress the warnings.
We also exclude more of puppet-apt from linting, since it's
third-party code.
Instead of using a hardcoded value for the spritesheet dimensions,
automatically calculate them using `emoji_data`. This will free
us from updating them on every emoji datasource update, as well as
allow us to add the google blob emojiset.
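One plausible way to derive the dimensions, assuming each emoji-datasource entry carries sheet_x/sheet_y grid coordinates (the helper name is hypothetical):
```
def get_spritesheet_dimensions(emoji_data):
    # Each entry records its position on the spritesheet grid as
    # sheet_x/sheet_y, so the sheet size is simply one more than the
    # largest index in each direction.
    max_x = max(entry["sheet_x"] for entry in emoji_data)
    max_y = max(entry["sheet_y"] for entry in emoji_data)
    return (max_x + 1, max_y + 1)
```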
Since this class was built, folks have always chosen
to subclass JsonableError for situations where
the default of ErrorCode.BAD_REQUEST is insufficient.
So now we simplify the use cases, which also gets
us 100% coverage on this core module.
For building Zulip in an environment where a custom CA certificate is
required to access the public Internet, one needs to be able to
specify that CA certificate for all network access done by the Zulip
installer/build process. This change allows configuring that via the
environment.
We start to use puppet-lint to lint puppet modules by default by
adding it to tools/lint (which controls our linter tool chain).
We also define a few puppet-lint rules to exclude.
Fixes: #9185.
This will help us in reducing the size of the release tarball
significantly. I have refrained from changing the `EMOJISETS`
constant in the `emoji_setup_utils.py` as that controls the
emojisets that we want to support. Since we want to re-enable
the feature of changing emojisets sometime in the future,
that variable should be kept as it is, as it controls several
other things like the emoji scripts that we use to generate emoji
names. Changing it might cause hard-to-catch bugs.
`emoji-datasource` package v4.0.4 introduced the concept of qualified
and non-qualified emoji codes. Since chat programs don't need to use
the emoji representation selector, we migrated our infrastructure
to use non-qualified emoji codes. But we missed the fact that the
emoji file names in the emoji farm are based on the emoji data's
'unified' field, and the value of this field has changed. Consequently,
the image file names must also change. We used `emoji_code` while
converting the span tags to img tags while processing notifications.
But since `emoji_code` now refers to the non-qualified code while image
file names are based on the qualified code, we need to rename the images
to correctly do the conversion. This commit just fixes this.
Right now it only has one function, but the function
we removed never really belonged in actions.py, and
now we have better test coverage on actions.py, which
is an important module to get to 100%.
This ancient tool predates our practice of collecting test fixtures
for third-party integrations, which is a better general system for the
problem this solved.
After the messages have been imported, set the rendered_content of the
messages instead of leaving its value to be 'None'.
This is important to ensure that:
(1) Performance for users is good after completing the import.
(2) The database's full-text indexes have all of the imported messages
(which only happens properly when Message rows have their
rendered_content field edited).
Fixes #9168.
Now reading API keys from a user is done with the get_api_key wrapper
method, rather than directly fetching it from the user object.
Also, every place where an action should be done for each API key is now
using get_all_api_keys. This method returns for the moment a single-item
list, containing the specified user's API key.
This commit is the first step towards allowing users have multiple API
keys.
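A minimal sketch of what such wrappers look like, assuming the key is stored on `user_profile.api_key`:
```
from typing import List

def get_api_key(user_profile) -> str:
    # Single point of access for reading a user's API key, instead of
    # touching user_profile.api_key directly throughout the codebase.
    return user_profile.api_key

def get_all_api_keys(user_profile) -> List[str]:
    # For now every user has exactly one key; returning a list lets callers
    # already be written for a future with multiple API keys per user.
    return [user_profile.api_key]
```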
In tools/update-locked-requirements line 66:
compile_requirements requirements/prod.in $OUTPUT_BASE_DIR/prod.txt
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/update-locked-requirements line 67:
compile_requirements requirements/dev.in $OUTPUT_BASE_DIR/dev.txt
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/update-locked-requirements line 68:
compile_requirements requirements/mypy.in $OUTPUT_BASE_DIR/mypy.txt
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/update-locked-requirements line 69:
compile_requirements requirements/docs.in $OUTPUT_BASE_DIR/docs.txt
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/update-locked-requirements line 70:
compile_requirements requirements/thumbor.in $OUTPUT_BASE_DIR/thumbor.txt py2
^-- SC2086: Double quote to prevent globbing and word splitting.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/travis/production-helper line 24:
if ! apt-get dist-upgrade -y $APT_OPTIONS; then
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/travis/production-helper line 26:
apt-get dist-upgrade -y $APT_OPTIONS
^-- SC2086: Double quote to prevent globbing and word splitting.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/test-migrations line 18:
echo "$new_auto_named_migrations" | sed 's/\[[x ]\] / /'
^-- SC2001: See if you can use ${variable//search/replace} instead.
In tools/test-migrations line 27:
echo 'ERROR: Migrations are not consistent with models! Fix with `./tools/renumber-migrations`.'
^-- SC2016: Expressions don't expand in single quotes, use double quotes for that.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/test-install/destroy-all line 31:
| while read c
^-- SC2162: read without -r will mangle backslashes.
In tools/test-install/install line 57:
installer_dir="$(readlink -f $INSTALLER)"
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/test-install/lxc-wait line 30:
for i in {1..60}; do
^-- SC2034: i appears unused. Verify use (or export if used externally).
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/test-documentation line 6:
echo -e "\e[${color_code}m${message}\e[0m" >&2
^-- SC1117: Backslash is literal in "\e". Prefer explicit escaping: "\\e".
^-- SC1117: Backslash is literal in "\e". Prefer explicit escaping: "\\e".
In tools/test-documentation line 41:
scrapy crawl_with_status documentation_crawler $loglevel
^-- SC2086: Double quote to prevent globbing and word splitting.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/test-all-docker line 7:
source /home/zulip/.bash_profile
^-- SC1091: Not following: /home/zulip/.bash_profile: openBinaryFile: does not exist (No such file or directory)
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/test-all line 7:
TEMP=`getopt -o f --long force -- "$@"`
^-- SC2006: Use $(..) instead of legacy `..`.
In tools/test-all line 24:
echo "Running $@"
^-- SC2145: Argument mixes string and array. Use * or separate argument.
In tools/test-all line 26:
printf "\n\e[31;1mFAILED\e[0m $@\n"
^-- SC2059: Don't use variables in the printf format string. Use printf "..%s.." "$foo".
^-- SC1117: Backslash is literal in "\n". Prefer explicit escaping: "\\n".
^-- SC1117: Backslash is literal in "\e". Prefer explicit escaping: "\\e".
^-- SC1117: Backslash is literal in "\e". Prefer explicit escaping: "\\e".
^-- SC2145: Argument mixes string and array. Use * or separate argument.
^-- SC1117: Backslash is literal in "\n". Prefer explicit escaping: "\\n".
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/start-dockers line 7:
source /home/zulip/.bash_profile
^-- SC1091: Not following: /home/zulip/.bash_profile: openBinaryFile: does not exist (No such file or directory)
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/reset-to-pull-request line 25:
git fetch "$remote" +"pull/$request_id/head":"$target_ref"
^-- SC2140: Word is of the form "A"B"C" (B indicated). Did you mean "ABC" or "A\"B\"C"?
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/provision line 13:
FAIL="\033[91m"
^-- SC1117: Backslash is literal in "\0". Prefer explicit escaping: "\\0".
In tools/provision line 14:
WARNING="\033[93m"
^-- SC1117: Backslash is literal in "\0". Prefer explicit escaping: "\\0".
In tools/provision line 15:
ENDC="\033[0m"
^-- SC1117: Backslash is literal in "\0". Prefer explicit escaping: "\\0".
In tools/provision line 19:
PARENT_PATH=$( cd "$(dirname "${BASH_SOURCE}")" ; pwd -P )
^-- SC2128: Expanding an array without an index only gives the first element.
In tools/provision line 32:
if [ $failed = 1 ]; then
^-- SC2086: Double quote to prevent globbing and word splitting.
In tools/provision line 49:
echo 'or just close this shell and start a new one (with Vagrant, `vagrant ssh`).'
^-- SC2016: Expressions don't expand in single quotes, use double quotes for that.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/optimize-svg line 3:
if [ `node_modules/.bin/svgo -f static/images/integrations/logos | grep -o '\.[0-9]% = ' | wc -l` -ge 1 ]
^-- SC2046: Quote this to prevent word splitting.
^-- SC2006: Use $(..) instead of legacy `..`.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
In tools/find-unused-css line 5:
if [ $(git grep "$n" | grep -v '^static/styles/zulip.css' | wc -l) -eq 0 ]; then
^-- SC2046: Quote this to prevent word splitting.
^-- SC2126: Consider using grep -c instead of grep|wc -l.
In tools/find-unused-css line 6:
echo $n
^-- SC2086: Double quote to prevent globbing and word splitting.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>