The name for_stream_name is more appropriate here. The name
for_stream is more suitable for a function that takes in a Stream
object, which we're about to add.
Our hash-naming of production assets interacted badly with the "look
at files in a directory" algorithm used to determine what sound
options exist for the "notification sound" feature. For lack of a
better solution, we fix this by excluding files with an extra `.` in
their name.
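A minimal sketch of that filter (directory layout, extension, and
function name here are assumptions, not the actual code):
```
import os

def get_available_notification_sounds(sounds_dir: str) -> list:
    # Hash-named production assets look like "zulip.0abc123.ogg";
    # legitimate sound options look like "zulip.ogg".
    sounds = []
    for fn in sorted(os.listdir(sounds_dir)):
        root, ext = os.path.splitext(fn)
        if "." in root:
            continue  # an extra "." means a hash-named asset; skip it
        if ext == ".ogg":
            sounds.append(root)
    return sounds
```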
Extracts common tests so that future social-auth backends can
be tested without duplicating them. I have been careful not to
change any testing logic.
Add all the stop words to page_params, reading from the
`zulip_english.stop` database, with caching to avoid loading the file
on every page load.
Part of #10592.
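A minimal sketch of the caching approach (the file path and helper
name are illustrative):
```
from typing import List, Optional

stop_words_list: Optional[List[str]] = None

def read_stop_words() -> List[str]:
    # Cache the contents in a module-level variable so the
    # zulip_english.stop file is read at most once per process.
    global stop_words_list
    if stop_words_list is None:
        with open("puppet/zulip/files/postgresql/zulip_english.stop") as f:
            stop_words_list = f.read().splitlines()
    return stop_words_list
```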
This causes changing the email_address_visibility field to actually
modify what user_profile.email values are generated for users, both on
user creation and afterwards as email addresses are edited.
The overall feature isn't yet complete, but this brings us pretty close.
The original commit made a number of well-meaning but suboptimal product
changes to the invoice events, such as threading them under the invoice id
rather than the customer id. It also seems to be causing 500s for
invoiceitem events, though I'm not sure why.
This reverts commit 65489b0391.
We had disabled reference-style links in bugdown; however,
we hadn't disabled them in marked. This commit rectifies
that and adds test cases for the same.
Fixes #11350.
This helps keep the realm.json small and easy to process; previously,
almost the entire size of that file was the analytics data.
We implement this by refactoring the analytics Config objects into a
separate subroutine that writes to a separate file, plus the
corresponding import code.
Manual testing was performed by exporting the 'analytics' realm, and
importing back to a newly created 'test' realm. The 'test' realm was
then exported and the json files were inspected. The data appeared
consistent with no abnormalities.
Fixes: #11220.
We eliminated use of this function in outgoing_webhook.py in
bdc95b5d72.
Tweaked by tabbott to also eliminate code only used for that mock.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
This commit does the following three things:
1. Update stream model to accommodate rendered description.
2. Render and save the stream rendered description on update.
3. Render and save stream descriptions on creation.
Further, the stream's rendered description is also sent whenever the
stream's description is being sent.
This is preparatory work for eliminating the use of the
non-authoritative marked.js markdown parser for stream descriptions.
This adds a new API for sending basic analytics data (number of users,
number of messages sent) from a Zulip server to the Zulip Cloud
central analytics database, which will make it possible for servers to
elect to have their usage numbers counted in published stats on the
size of the Zulip ecosystem.
This is primarily a feature for onboarding, where an organization
administrator might send a bunch of random test messages as part of
joining, but then want a pristine organization when their users later
join.
But it can theoretically be used for other use cases (e.g. for
moderation or removing threads that are problematic in some way).
Tweaked by tabbott to handle corner cases with
is_history_public_to_subscribers.
Fixes #10912.
This replaces the current usage of stream names with stream ids.
This commit also removes the `traditional` attribute from the invite
form as now we are sending stream_ids as an argument; this was the
only place in the codebase we used traditional=true, and it's great to
have it removed.
Ever since we implemented support for stream IDs in Addressee,
Addressee.stream_name() can now return None. This commit ensures
that _internal_prep_message only calls ensure_stream when
Addressee.stream_name() is not None.
This commit also contains the following auxiliary changes:
* Adds a custom exception, StreamWithIDDoesNotExist, for when
a stream with a given ID does not exist, because the error
message returned by StreamDoesNotExist only makes sense with
stream names, not IDs.
* Adds a new helper, get_stream_by_id_in_realm, which is similar
to get_user_profile_by_id_in_realm (introduced in #10391); see the
sketch after this list.
* Adds a helper, validate_stream_id_with_pm_notification, which
returns the Stream object associated with a given ID and also
handles PM notifications to the bot owner if the message was
sent by a bot and if the stream does not exist or has no
subscribers.
* Modifies the message sent by send_pm_if_empty_stream to
accommodate stream IDs.
Note that all of the above changes are required before check_message
can be modified to support stream IDs.
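A sketch of the get_stream_by_id_in_realm helper mentioned above,
assuming Zulip's Django models; the exact query may differ:
```
from zerver.models import Realm, Stream

def get_stream_by_id_in_realm(stream_id: int, realm: Realm) -> Stream:
    # Scoping the lookup to the realm mirrors
    # get_user_profile_by_id_in_realm; raises Stream.DoesNotExist
    # (wrapped as StreamWithIDDoesNotExist by the caller) if no such
    # stream exists in this realm.
    return Stream.objects.select_related().get(id=stream_id, realm=realm)
```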
This code has oscillated a few times; the original comparison
was added as part of Stride import but broke HipChat import.
c34a8f2e69 fixed HipChat import but
regressed Stride.
This change fixes the comparison for both HipChat and Stride.
Apparently, the "continue to registration" flow used a subtly invalid
way of encoding the full name. We put it in the query part of the action
URL of the HTML form, but apparently HTML forms with a `GET` type will
ignore the query part (replacing it with any input values), which
makes sense but doesn't do what we want here. There are a few sane
ways to fix it, but given that the encoding logic we had before for
including the name in the URL was ugly, I'm pretty happy with just
adding a hidden input to the form for the name.
As part of Google+ being removed, they've eliminated support for the
/plus/v1/people/me endpoint. Replace it with the very similar
/oauth2/v3/userinfo endpoint.
This additional logic to prevent resizing in certain circumstances
(file size, dimensions) is necessary because the Pillow GIF handling
code seems to be rather flaky with regard to handling GIF color
palettes, causing broken GIFs after resizing. The workaround is to
only resize when absolutely necessary (e.g. because the file is larger
than 128x128 or 128KB).
Fixes #10351.
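A hedged sketch of the guard (the limits and names are assumptions;
real animated-GIF resizing is more involved than the single-frame
save shown here):
```
import io
from PIL import Image

MAX_SIZE = 128               # max pixels per side
MAX_FILE_SIZE = 128 * 1024   # 128 KB

def resize_gif_if_necessary(image_data: bytes) -> bytes:
    im = Image.open(io.BytesIO(image_data))
    if (im.size[0] <= MAX_SIZE and im.size[1] <= MAX_SIZE
            and len(image_data) <= MAX_FILE_SIZE):
        # Within limits: return the bytes untouched, sidestepping
        # Pillow's flaky GIF palette handling entirely.
        return image_data
    # Otherwise we must resize (simplified: first frame only).
    im = im.resize((MAX_SIZE, MAX_SIZE))
    out = io.BytesIO()
    im.save(out, format="GIF")
    return out.getvalue()
```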
We had initially designed the poll widget like a blog
post with comments beneath it but it makes more sense
to think of it as just a simple poll with options.
We add a new syntax which converts messages like the following:
```
/poll Who do you support?
Nadal
- Djokovic
```
to a poll with the two names as options. The list syntax is optional
since anyone making a poll is likely to want to create a list anyway.
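A sketch of the parsing (the function name is illustrative):
```
from typing import List, Tuple

def parse_poll_message(content: str) -> Tuple[str, List[str]]:
    lines = content.splitlines()
    question = lines[0][len("/poll"):].strip()
    options = []
    for line in lines[1:]:
        option = line.strip()
        if option.startswith("- "):
            option = option[2:].strip()  # the "- " prefix is optional
        if option:
            options.append(option)
    return question, options

# parse_poll_message("/poll Who do you support?\nNadal\n- Djokovic")
# -> ("Who do you support?", ["Nadal", "Djokovic"])
```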
This doesn't have any security impact, since we overwrote any other
fields in any case, and also this step happens before the security
part of input validation for stream creation. But this does improve
error messages if one tries to specify other arguments, and also makes
more clear that the `description` argument is supported here.
Refactor the potentially expensive work done by Beautiful Soup into a
function that is called by the alter_content function, so that we can
cache the result. Saves a significant portion of the runtime of
loading of all of our /help/ and /api/ documentation pages (e.g. 12ms
for /api).
Fixes #11088.
Tweaked by tabbott to use the URL path as the cache key, clean up
argument structure, and use a clearer name for the function.
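Roughly the shape of the refactor, as a sketch (the real code keys its
cache on the request's URL path and uses Zulip's cache helpers rather
than functools; the render helper below is a stand-in):
```
from functools import lru_cache
from bs4 import BeautifulSoup

def render_documentation_page(url_path: str) -> str:
    # Stand-in for the template render of a /help/ or /api/ page.
    return "<html><body><h1>%s</h1></body></html>" % (url_path,)

@lru_cache(maxsize=None)
def get_documentation_soup(url_path: str) -> BeautifulSoup:
    # The expensive BeautifulSoup parse now happens once per URL path.
    return BeautifulSoup(render_documentation_page(url_path), "html.parser")

def alter_content(url_path: str) -> str:
    soup = get_documentation_soup(url_path)
    # ...cheap per-request adjustments to the parsed tree go here...
    return str(soup)
```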
In very old Slack workspaces, slackbot can appear as "Slackbot", and
the import script only checks for "slackbot" (case-sensitive). This
breaks the import: it trips the assert that immediately follows the
test. I don't know how common this is, but it definitely affected our
import.
The simple fix is to compare against a lowercased version of the
user's full name.
Earlier, our realm filters didn't render for languages that do not
use spaces (e.g. Japanese), since we used to check for the presence
of an actual space character. This commit replaces that logic with
a complex scheme to detect word boundaries.
Also, we convert the RealmFilterPattern to subclass InlineProcessor
and make use of the new no-op feature in py-markdown 3.0.1 where we
can tell py-markdown that our pattern didn't find a match despite
the initial regex getting matched.
Fixes #9883.
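A sketch of the no-op behavior described above (the word-boundary
check and the URL are simplified stand-ins, not the real scheme):
```
from xml.etree.ElementTree import Element
from markdown.inlinepatterns import InlineProcessor

class RealmFilterPattern(InlineProcessor):
    def handleMatch(self, m, data):
        # Simplified boundary test: reject the match if the preceding
        # character is alphanumeric.
        if m.start(0) > 0 and data[m.start(0) - 1].isalnum():
            # py-markdown 3.0.1's no-op: tell the parser that, despite
            # the regex matching, our pattern found nothing.
            return None, None, None
        el = Element("a")
        el.set("href", "https://example.com/" + m.group(0))
        el.text = m.group(0)
        return el, m.start(0), m.end(0)
```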
Since we are building our parser from scratch now:
1. We have control over which processor goes at what priority number.
Thus, we have also shifted the deprecated `.add()` calls to use the
new `.register()` calls with explicit priorities, while maintaining
the original order that the old method generated. (A sketch follows
after this list.)
2. We do not have to remove the processors added by py-markdown that
we do not use in Zulip; we explicitly add only the processors we
do require.
3. We can cluster the building of each type of parser in one place,
and in the order they need to be so that when we register them,
there is no need to sort the list. This also makes for a huge
improvement in the readability of the code, as all the components
of each type are registered in the same function.
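A sketch of the explicit-priority registration style from point 1
(the processor, name, and priority are illustrative):
```
import markdown
from markdown.preprocessors import Preprocessor

class ExamplePreprocessor(Preprocessor):
    def run(self, lines):
        return lines  # stand-in body

class ExampleExtension(markdown.Extension):
    def extendMarkdown(self, md):
        # .register() takes an explicit priority (higher runs first),
        # replacing the relative '<name'/'>name' ordering of .add().
        md.preprocessors.register(ExamplePreprocessor(md), "example", 25)
```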
These are significant performance improvements, because we save on
calls to `str.startswith` in `.add()`, all the resources taken to
generate the default to-be-removed processors, and the time taken to
sort the list of processors.
Following are the profiling results for the changes made. Here, we
build 10 engines one after the other and note the time taken to build
each of them. 1st pass represents the state after this commit and 2nd
pass represents the state after some regex modifications in the
commits by Steve Howell that follow. All times are in microseconds.
| nth Engine | Old Time | 1st Pass | 2nd Pass |
| ---------- | -------- | -------- | -------- |
| 1 | 92117.0 | 81775.0 | 76710.0 |
| 2 | 1254.0 | 558.0 | 341.0 |
| 3 | 1170.0 | 472.0 | 305.0 |
| 4 | 1155.0 | 519.0 | 301.0 |
| 5 | 1170.0 | 546.0 | 326.0 |
| 6 | 1271.0 | 609.0 | 416.0 |
| 7 | 1125.0 | 459.0 | 299.0 |
| 8 | 1146.0 | 476.0 | 390.0 |
| 9 | 1274.0 | 446.0 | 301.0 |
| 10 | 1135.0 | 451.0 | 297.0 |
We avoid re-computing the regex string here, and we
also avoid re-compiling the regex itself.
I decided to put the "one_time" decorator in the
bugdown file itself, just to reduce friction for folks
reading the "buyer beware" comments.
Unfortunately, we can't use this for the
get_web_link_regex() function due to testing concerns,
so that continues to do an inelegant cache-with-global-var
scheme.
We use early-exit to flatten the code.
I also tweaked the comments a bit based on some recent
profiling findings. (e.g. reading the file isn't actually
a big bottleneck; it's more the regex itself.)
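The decorator itself is tiny; a sketch consistent with the
description above:
```
from typing import Callable, Optional, TypeVar

ReturnT = TypeVar("ReturnT")

def one_time(method: Callable[[], ReturnT]) -> Callable[[], ReturnT]:
    # Buyer beware: no arguments, no invalidation; the first call's
    # result is served forever after.
    val = None

    def cache_wrapper() -> ReturnT:
        nonlocal val
        if val is None:
            val = method()
        return val

    return cache_wrapper
```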
This fixes an annoying bug where clicking to subscribe to a stream
would change the color shown in the "manage streams" UI immediately
after you click.
Fixes #11072.
This adds a setting under the "Notification" section of the
"Organization settings" tab, which enables organization administrators
to control whether missed-message emails include the message content
or not.
Fixes: #11123.
Multiple delete message requests for the same message sometimes caused
a 500 error. This happened via the normal IntegrityError being thrown
by the message deletion/archiving code.
This was manually reproduced by adding latency in function
move_messages_to_archive() in retention.py and
delete_message_backend() in views.py. This addresses the problem by
adding code to handle the exception and raise a JsonableError,
converting the 500 into a 400, with an automated test.
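The shape of the fix, as a sketch (JsonableError is stubbed here for
self-containment; in Zulip it maps to a 400 response, and
move_messages_to_archive is assumed to take a list of message ids):
```
from django.db import IntegrityError
from zerver.lib.retention import move_messages_to_archive

class JsonableError(Exception):
    """Stand-in for Zulip's JsonableError (rendered as a 400)."""

def archive_message_safely(message_id: int) -> None:
    try:
        move_messages_to_archive([message_id])
    except IntegrityError:
        # A concurrent request already archived/deleted this message.
        raise JsonableError("Invalid message(s)")
```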
Apparently, Clubhouse has two different payloads for story label
changes, one where the label name lives inside the "references"
object, and the other where it lives inside the "actions" object.
Their webhook API is still in beta, so this could just be a bug.
Anyhow, we should support both.
This adds a server-side check to verify whether the user sending a
request to create a stream where only admins can post is an admin or
not; it raises a JsonableError when the user is not a realm admin.
You can now pass in an info field with a value
like "out to lunch" to /users/me/status,
and the server will include that in its outbound
events.
The semantics here are that both "away" and
"status_text" have to have defined values in order
to cause changes. You can omit the keys or
pass in None when values don't change.
The way you clear info is to pass the empty
string.
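The update semantics, as a runnable sketch over a plain dict
(the function name is illustrative):
```
from typing import Any, Dict, Optional

def apply_status_update(status: Dict[str, Any],
                        away: Optional[bool] = None,
                        status_text: Optional[str] = None) -> Dict[str, Any]:
    if away is not None:         # omitted/None: leave "away" unchanged
        status["away"] = away
    if status_text is not None:  # "" is the explicit "clear it" value
        status["status_text"] = status_text.strip()
    return status

# apply_status_update({}, away=True, status_text="out to lunch")
# -> {"away": True, "status_text": "out to lunch"}
```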
We also change page_params to have a dictionary
called "user_status" instead of a set of user
ids. This requires a few small changes on the
frontend. (We will add "status_text" support in
subsequent commits; the changes here just keep
the "away" feature working correctly.)
We now have a single function that handles both away
and not-away.
This refactoring sets us up to piggyback "info" more
easily onto status updates.
The only thing that changes here is that we don't
delete database rows any more when users revoke
their away status. Instead we just set the status
to NORMAL.
When I was initially writing the tests to solve issue #10131 in the
PR, I ended up with 2 schema checkers as I modified the code to send
the rendered_value only when required.
When I was using just 1 schema checker shared between two code paths,
we needed _allow_only_listed_keys. But after shifting to 2 schema
checkers for the two different cases, we no longer needed that flag,
and it's better to remove it for a stronger check.
This reverts the temporary fix done in commit
46f4e58782 and replaces it with the proper fix:
non-admins should be able to see a dropdown to select a non-admin type
of invited user, i.e. normal member or guest user.
`fakeldap` assumes every attribute is a multi-value attribute
while making comparisons in `_compare_s()`, and so while making
comparisons for the password it gives a false positive. The result
of this was that it was possible to log in to the dev environment
using LDAP with a substring of the password. For example, if the
LDAP password is `ldapuser1`, even entering `u` would log you in.
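The root cause in miniature: when the stored value is a plain string
rather than a list, Python's `in` becomes a substring test.
```
password = "ldapuser1"

# Buggy comparison (string treated as the container): substring match.
assert "u" in password           # True -- any substring "authenticates"

# Correct multi-value comparison (list container): exact membership.
assert "u" not in [password]     # "u" is not equal to "ldapuser1"
assert "ldapuser1" in [password]
```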
On the backend, we extend the BlockQuoteProcessor's clean function that
just removes '>' from the start of each line to convert each mention to
have the silent mention syntax, before UserMentionPattern is invoked.
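A sketch of that clean()-time rewrite (the class shape is
illustrative):
```
from markdown.blockprocessors import BlockQuoteProcessor

class SilentMentionBlockQuoteProcessor(BlockQuoteProcessor):
    def clean(self, line: str) -> str:
        line = super().clean(line)  # the stock behavior: strip '>'
        # Rewrite "@**name**" to the silent form "@_**name**" before
        # UserMentionPattern sees the quoted text; already-silent
        # "@_**name**" mentions are left untouched.
        return line.replace("@**", "@_**")
```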
The frontend, however, has an edge case where if you are mentioned in
some message and you quote it while having mentioned yourself above
the quoted message, you wouldn't see the red highlight till we get the
final rendered message from the backend.
This is such a subtle glitch that it's likely not worth worrying about.
Fixes #8025.
These mentions look like regular mentions except they do not
trigger any notification for the person mentioned. These are
primarily to be used when you make a bot take an action and
the bot mentions you, or when you quote a message that mentions
you.
Fixes #11221.
Apparently, Zoom's API will (sometimes?) return a 201 Created (not
200) in response to the API request to create a call. We fix this
by using the proper requests check for whether or not the request
failed.
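The gist, as a sketch (the endpoint and payload are illustrative):
```
import requests

response = requests.post("https://api.zoom.us/v2/users/me/meetings",
                         json={"topic": "Zulip call"})
# response.ok is True for any status below 400, so a 201 Created
# no longer gets misclassified as a failure.
if not response.ok:
    raise Exception("Failed to create call: %s" % (response.status_code,))
```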
This commit fixes an error in the logic for allowing admins to edit any
user's CPF (custom profile field) values. The logic allowing users to
edit their own CPF values is however sound. What happens is that of all
the CPF types, for "choice fields" as well as "URL" and "date fields",
when the value is reset/deleted/cleared by the admin in the Admin UI
(organization settings), the frontend would send a null (empty string)
value to the backend for that custom profile field (as this is, after
all, the new value in this case). This would then trigger the backend
validators to return an error message.
We fix this by using the method check_remove_custom_profile_field_value,
which both code paths (user editing their own CPFs and admin editing a
user's CPF) can call.
This moves the logic for deleting the user's custom profile field
value in the remove_user_custom_profile_data view function to a method
named check_remove_user_custom_profile_value in actions.py, so that we
can reuse it in the next commit.
We make this change because setting up reminders in PMs didn't
play really well with our current infrastructure. Basically, the
reminder messages from the bot can't appear in the same narrow as
that of a PM between two people, and therefore we disable it.
Though we make an exception here where a person wants to set up a
reminder for themselves.
We do this since we have yet to figure out how the entire realm
internal bots scenario should work, and therefore for the time being
we will use the notification bot to deliver the reminders.
Previously, the subscription color attribute had a validator of
check_string, but this is insufficient. Hence this commit updates the
validator to check_color. Fixes #11268.
For unsupported or invalid payloads, we should just raise the
UnexpectedWebhookEventType exception and let our logging system
take care of recording the payload that caused the error.
This commit improves a couple of things:
* All of the message templates are now at the top, a convention
we follow in a lot of our webhooks.
* Messages are not prefixed with any emojis. We don't do this in
any of our other webhooks. Plus, the emojis were outdated.
* Remove some superfluous code.
* Use ```quote <quote goes here> ``` style formatting for
quoted text instead of the `>` character.
This commit only adds support for the four events that have sample
payloads provided for them on the Pagerduty developer website.
Support for the remaining events will be added in subsequent
commits, as we get access to more sample payloads.
Previously, zerver.views.registration.confirmation_key was only
available in development; now we make that more structurally clear by
moving it to the special zerver/views/development directory.
Fixes #11256.
Some URLs are only available in the development environment
(dev_urls.py); the corresponding views (here, email_log.py) are moved
to the new directory zerver/views/development.
Fixes #11256.
This is a major upgrade, and requires some significant compatibility
work:
* Migrating the pattern-removal logic to use the Registry feature.
* Handling the removal of positional arguments in markdown extensions.
* Handling the removal of safe mode.
This adds a language parameter to send_future_email. As a result, this
properly internationalizes invitation reminder emails, by passing the
correct language into send_future_email.
Fixes #11240.
We still create a Python 2 virtualenv for thumbor but that’s
separate (/srv/zulip-thumbor-venv from
scripts/lib/create-thumbor-venv).
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
Now, if you pass an api_key, we'll initialize the public room
subscribers to be whatever they were at the time the import happened.
Also, document the situation in the caveats section.
We had inconsistent behavior when `LDAP_APPEND_DOMAIN` was set,
in that we allowed the user to enter a username instead of their email
in the auth form, but later the workflow failed due to a small bug.
Fixes: #10917.
This now sets the user-agent to something like:
ZulipOutgoingWebhook/2.0
(It uses the current ZULIP_VERSION.)
Before this change, the user-agent would be
something like `python-requests/2.18.4`.
Fixes #10741.
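A sketch of the header construction (ZULIP_VERSION is stubbed here for
self-containment):
```
ZULIP_VERSION = "2.0"  # stand-in for the server's version constant

def outgoing_webhook_headers() -> dict:
    return {"User-Agent": "ZulipOutgoingWebhook/" + ZULIP_VERSION}

# requests.post(url, json=payload, headers=outgoing_webhook_headers())
```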
Closes #11195. We add a management command to allow us to send emails
to the email mirror directly. The command doesn't require any
configuring of email sending or receiving for the email mirror;
it passes the emails directly using the process_message function.
We need to explicitly check for empty recipient lists in
send_message to ensure that internal_send_huddle_message doesn't
call Addressee.for_private with an empty recipient list.
The check for empty recipient lists (the "message_to" argument)
does not belong in check_message. As we implement support for
sending messages by user IDs (see #9474), we will be extending
much of the existing code in Addressee.legacy_build that validates
recipient lists. Therefore, Addressee.legacy_build is a much more
apt area to check for empty recipient lists.
Also, Addressee.for_private and Addressee.for_user_ids also need
to do their own validation, since not everything goes through
Addressee.legacy_build. It is okay to simply throw a 500 in these
cases because we expect that callers will be doing their own
validation for calls that don't go through Addressee.legacy_build().
This commit is a part of our efforts surrounding #9474.
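A sketch of the constructor-level validation (simplified; Zulip's
Addressee carries more state than shown here):
```
from typing import List

class Addressee:
    def __init__(self, emails: List[str]) -> None:
        self.emails = emails

    @staticmethod
    def for_private(emails: List[str]) -> "Addressee":
        if not emails:
            # A bare 500 is acceptable here: callers that bypass
            # legacy_build are expected to validate their own input.
            raise AssertionError("Empty recipient list in for_private")
        return Addressee(emails)
```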
This adds a feature to send a notification to the stream, via the
notification bot, when the stream is renamed. user_profile is passed
to do_rename_stream so that the name of the user who renamed the
stream can be used in the notification. The notification is sent to
the stream using internal_send_stream_message in do_rename_stream.
Fixes #11034.
Previously, this wasn't an explicit feature of the export tool.
Note that the current version still includes metadata on private
streams and private message recipients, just not their messages.
This adds a proper template for the /digest page, making it a
reasonable way to view the digest email content for development and
debugging.
Fixes: #11016.
Previously, we had some hand-written logic for parsing the subject
line of the email's headers and turning it into a Python string using
each of the valid encodings for an email. That logic was buggy, and
sometimes resulted in a bytes object being passed into
`send_zulip`, which would eventually throw an exception.
The fix for this is to use the Python standard library's make_header
function for handling internationalized email headers.
https://stackoverflow.com/questions/7331351/python-email-header-decoding-utf-8
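The standard-library approach, which decodes each RFC 2047-encoded
chunk with its declared charset:
```
from email.header import decode_header, make_header

def decode_subject(raw_subject: str) -> str:
    return str(make_header(decode_header(raw_subject)))

# decode_subject("=?utf-8?B?44GT44KT44Gr44Gh44Gv?=")  -> "こんにちは"
# decode_subject("plain ascii subject")               -> "plain ascii subject"
```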
Fixes part 1 of #10612. We use a regex to remove RE:, FWD: (and similar
variations) from email subjects. A unit test is included; we add
subjects.json to the fixtures, containing various subjects to try the
stripping on.
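A sketch of the stripping (the exact pattern in the commit may
differ):
```
import re

REMOVE_RE = re.compile(r"^\s*(?:RE|FWD?)\s*:\s*", re.IGNORECASE)

def strip_from_subject(subject: str) -> str:
    # Loop so stacked prefixes like "RE: FWD: RE: hi" fully collapse.
    while True:
        stripped = REMOVE_RE.sub("", subject)
        if stripped == subject:
            return stripped.strip()
        subject = stripped

# strip_from_subject("Re: Fwd: Re: Quarterly planning")
# -> "Quarterly planning"
```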
Since we have already added the `invite_as` field to models, we can now
replace usage of `invite_as_admin` properly with its equivalent `invite_as
== PreregistrationUser.INVITE_AS['REALM_ADMIN']`.
Hence, also removed the now-redundant `invite_as_admin`.
This should make it easier for mobile/terminal apps to handle
situations like the user's API key changing.
Also fix the fact we were incorrectly using a 400, not 401, status
code for this case.
The logic for flushing the API key has been broken ever since we
added the cache, since we were incorrectly flushing the new API key,
not the old API key, from the cache after regeneration.
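The shape of the fix (the cache helpers here are illustrative names,
not Zulip's exact API):
```
def do_regenerate_api_key(user_profile) -> str:
    old_api_key = user_profile.api_key         # capture *before* rotating
    user_profile.api_key = generate_api_key()  # assumed helper
    user_profile.save(update_fields=["api_key"])
    # The bug was flushing the new key's cache entry; the stale entry
    # to remove is the one keyed on the old key.
    cache_delete(api_key_cache_key(old_api_key))  # assumed helpers
    return user_profile.api_key
```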
This should eliminate the need to do manual analytics work when
importing organizations imported/exported using the zulip -> zulip
import/export tools.
The slim_mode setting had been incorrectly configured to skip
"deleted" users, resulting in bugs where private messages with deleted
users would not be imported.
This endpoint serves requests which might originate from an image
preview link which had an http url and the message holding the image
link was rendered before we introduced thumbnailing. In that case
we would have used a camo proxy to proxy http content over https and
avoid mixed content warnings.
In near future, we plan to drop use of camo and just rely on thumbor
to serve such images. This endpoint helps maintain backward
compatibility for links which were already rendered.
This setting splits away part of the responsibility from THUMBOR_URL.
From now on, this setting will be responsible for controlling whether
we thumbnail images or not, by asking bugdown to render image links
to hit our /thumbnail endpoint. This is irrespective of what
THUMBOR_URL is set to, though ideally THUMBOR_URL should be set
to point to a running thumbor instance.
We used to apply the sharpen filter for all image sizes, whereas it
was intended for resized images only, which would have been smoothened
out a bit by the resize operation.
This unnecessary use of the filter used to result in weird issues
with full-size images.
For example, the image located at
http://arqex.com/wp-content/uploads/2015/02/trees.png,
when rendered in full size, would have just its boundaries visible.
When trying to find the email gateway address, use the
`email.utils.getaddresses` function to deal with cases
where multiple recipients are included in the email header
or the stream address appears as an angle-addr with a
name given (e.g. if someone added it to their address book).
Added some other headers where the required address may
appear: "Resent" headers are sometimes used for forwarding,
and streams may also be found in CC. There is no way to find
the address if the email was received as a BCC.
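A sketch of the header scan (the header list is illustrative):
```
from email.message import Message
from email.utils import getaddresses

def candidate_gateway_addresses(message: Message):
    headers = ["To", "CC", "Resent-To", "Resent-CC"]
    values = []
    for header in headers:
        values.extend(message.get_all(header, []))
    # getaddresses copes with multiple recipients per header and with
    # angle-addr forms like 'Stream Name <stream@example.com>'.
    return [address for _name, address in getaddresses(values)]
```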
This should hopefully be the last commit of this form; ultimately, my
hope is that we'll be able to refactor the semi-duplicated logic in
this file to avoid so much effort going into keeping this correct.
For many actions, we make a single call to
send_event, and it's kind of heavy now to
properly assert we made one call, and we
don't need to exercise all the tornado code
to prove that the action was written correctly.
This makes it possible to include our standard markdown formatting in
one's custom profile fields, allowing for links, emphasis, emoji, etc.
Fixes #10131.
While we're at it, we remove the JSON parsing that was part of the
user field code path, since this function isn't responsible for
rendering user fields.
This adds a model field `invite_as` to the PreregistrationUser model
class, which will replace `invited_as_admin` in the future. The
intention behind adding this is that we can specify "types" of invited
users other than admin or regular, that is, guest users or maybe many
others in the future.
The octet-stream content type is potentially under-specified, but it's
better than potentially submitting None and increases consistency of
this part of the codebase.
The boto library's s3 interface allows setting only string-format
metadata keys. So we need to cast the last_modified floating-point
timestamp into a string before storing on the S3 object.
This bug mostly broke uploading avatars when using the S3 storage backend.
Apparently, hc-migrate can generate emoticons.json files with a
somewhat different format. Assuming that other files are in the
normal format, we should be able to handle it like this.
See report in #11135.
Apparently, some methods of exporting from HipChat do not include an
emoticons.json file. We could test for this using the
`include_emoticons` field in `metadata.json`, but we currently don't
even bother to read that file. Rather than changing that, we just
print a warning and proceed. This is arguably better anyway, in that
often not having emoticons.json is the result of user error when
exporting, and it's nice to flag that this is happening.
Fixes#11135.
This IntegrityError has been happening occasionally in production due
to races, likely due to some sort of mobile app double-post bug.
Handle this by avoiding a 500, and returning the same 400 we would do
if there hadn't been a race.
This section is largely unnecessary, doesn't convey any useful
information, and is probably a remnant from an older version of
this doc that we forgot to remove.
This commit adds a custom Markdown include extension which is
identical to the original except when a macro file can't
be found, it raises a custom JsonableError exception, which
we can catch and then trigger an appropriate test failure.
Fixes: #10947.
Our HipChat conversion tool didn't properly handle basic avatar
images, resulting in only the medium-size avatar images being imported
properly. This fixes that bug by asking the import tool to do the
thumbnailing for the basic avatar image (from the .original file) as
well as the medium avatar image.
This is a major rewrite of the billing system. It moves subscription
information off of stripe Subscriptions and into a local CustomerPlan
table.
To keep this manageable, it leaves several things unimplemented
(downgrading, etc), and a variety of other TODOs in the code. There are also
some known regressions, e.g. error-handling on /upgrade is broken.