Fixes #7665.
In the case of invitation events, an 'invites_changed' event without
any real payload is sent to all the realm admins and to the user.
The event is handled by reloading the invitation list to show recent changes.
Commit tweaked by shubhamdhama:
* Send an `invite_changed` event when a user accepts an invite.
Also, added a test for the same.
* No need to delete the invite list in the frontend; the current logic
properly handles the case when the invite data is changed.
* Extracted the common logic for sending an event into
`notify_invites_changed` (sketched below).
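A minimal sketch of that shared helper, using simplified stand-ins for the
server's real event-delivery plumbing and realm-admin lookup (the stub
`send_event` below is an assumption, not the actual implementation):

```python
from typing import Any, Dict, List


def send_event(event: Dict[str, Any], user_ids: List[int]) -> None:
    # Stand-in for the real event-queue delivery code.
    print(f"delivering {event} to {user_ids}")


def notify_invites_changed(realm_admin_ids: List[int], user_id: int) -> None:
    # The event carries no real payload; clients react by reloading
    # the invitation list to pick up whatever changed.
    event = dict(type="invites_changed")
    send_event(event, realm_admin_ids + [user_id])


notify_invites_changed(realm_admin_ids=[1, 2], user_id=7)
```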
This is all the plumbing that makes it possible to enable the
stream_email_notifications setting via the Zulip API. The flag
doesn't do anything yet, but this is a nice checkpoint along the way
to implementing this feature.
This commit creates a new field called delivery_email. For now, it is
exactly the same as email upon user profile creation, should stay
identical to email even when the email is changed, and is used only
for sending outgoing email from Zulip.
The purpose of this field is to support an upcoming option where the
existing `email` field in Zulip becomes effectively the user's
"display email" address, as part of making it possible for users'
actual email addresses (which can receive email and are stored in the
delivery_email field) not to be visible to other non-administrator
users in the organization.
Because the `email` field is used in numerous places in display code,
in the API, and in database queries, the shortest path to implementing
this "private email" feature is to keep "email" as-is in those parts
of the codebase, and just set the existing "email" ("display email")
model field to be something generated like
"username@zulip.example.com" for display purposes.
Eventually, we'll want to do further refactoring, either in the form
of having both `display_email` and `delivery_email` as fields, or
renaming "email" to "username".
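A rough sketch of the split, with hypothetical names; the generated
"display email" format is only illustrative:

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    id: int
    delivery_email: str  # the real, deliverable address; used for outgoing email
    email: str           # the "display email" shown in the UI, API, and queries


def create_user(user_id: int, address: str, hide_real_email: bool, realm_host: str) -> UserProfile:
    # For now delivery_email always mirrors the signup address; when the
    # "private email" option is enabled, the display email becomes a
    # generated, non-deliverable address instead.
    if hide_real_email:
        display = f"user{user_id}@{realm_host}"
    else:
        display = address
    return UserProfile(id=user_id, delivery_email=address, email=display)
```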
We extract the logic for generating a list of all historical
topics for a given stream into a separate function. This avoids code
duplication when we add a similar code path for grabbing all topics
for web-public streams.
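A simplified sketch of what such a shared helper does, operating on an
in-memory list of (message_id, topic) pairs rather than the real database
query:

```python
def get_topic_history(messages):
    """Collapse (message_id, topic) rows into one entry per topic, keeping
    the most recent message id so clients can sort topics by recency."""
    latest = {}
    for message_id, topic in messages:
        if topic not in latest or message_id > latest[topic]:
            latest[topic] = message_id
    return sorted(
        (dict(name=topic, max_id=max_id) for topic, max_id in latest.items()),
        key=lambda entry: entry["max_id"],
        reverse=True,
    )
```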
The main remaining todo for correctly populating
RealmAuditLog.requires_billing_update is supporting the de-seating (and
corresponding re-seating) that happens after being offline for two weeks.
The only changes visible at the AST level, checked using
https://github.com/asottile/astpretty, are
zerver/lib/test_fixtures.py:
'\x1b\\[(1|0)m' ↦ '\\x1b\\[(1|0)m'
'\\[[X| ]\\] (\\d+_.+)\n' ↦ '\\[[X| ]\\] (\\d+_.+)\\n'
which is fine because re treats '\\x1b' and '\\n' the same way as
'\x1b' and '\n'.
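For example, both spellings compile to patterns that match the same
characters:

```python
import re

# Python turns '\x1b' into the ESC character before re sees it; in the second
# pattern, re itself interprets the '\x1b' escape and yields the same character.
assert re.search('\x1b\\[(1|0)m', '\x1b[1m bold \x1b[0m')
assert re.search('\\x1b\\[(1|0)m', '\x1b[1m bold \x1b[0m')

# Likewise for '\n' vs '\\n'.
assert re.fullmatch('\\[[X| ]\\] (\\d+_.+)\n', '[X] 0001_initial\n')
assert re.fullmatch('\\[[X| ]\\] (\\d+_.+)\\n', '[X] 0001_initial\n')
```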
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
An estimated traffic of 0 suggests a stream is dead, and has pretty
different semantics from any non-zero value. So we should round up any
number between 0 and 1 to 1.
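A minimal sketch of that rounding rule (hypothetical helper name):

```python
def round_traffic_estimate(estimate: float) -> int:
    # 0 means "dead stream", which has very different semantics from "a little
    # traffic", so anything strictly between 0 and 1 rounds up to 1.
    if 0 < estimate < 1:
        return 1
    return round(estimate)


assert round_traffic_estimate(0) == 0
assert round_traffic_estimate(0.3) == 1
assert round_traffic_estimate(4.6) == 5
```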
We don't ever use this value, but it's confusing to have the incorrect
calculation in the code.
Ideally we would set this to "None", but I don't know the code well enough
to be confident nothing would break.
We've for a long time had the behavior that a bot mentioned in a
stream message receives the notification, regardless of whether the
bot was actually subscribed to the stream.
Apparently, this behavior also triggered if you mentioned a bot in a
private message (i.e. the bot would be delivered the private message
and would probably respond unhelpfully in a new group private message
thread with the PM's original recipients plus the bot).
The fix for this bug is simple: exclude this feature for private
messages.
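A simplified sketch of the guard, with hypothetical names for the
mention-processing step (the real logic lives in the message-send code path):

```python
from typing import List, Set


def mentioned_bot_ids_to_notify(
    mentioned_bot_ids: Set[int],
    is_private_message: bool,
    recipient_ids: List[int],
) -> Set[int]:
    # Bots mentioned in a stream message get the message even if they are not
    # subscribed; for private messages we only deliver to the actual
    # recipients, so mentioning an unrelated bot no longer pulls it in.
    if is_private_message:
        return mentioned_bot_ids & set(recipient_ids)
    return mentioned_bot_ids
```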
This fixes an issue where if you make #announce (the default
announcement stream) announce-only, then creating a new stream will
throw an exception (because notification-bot can't send there).
Fixes #9636.
The user can now specify the value while creating a stream.
An admin can later change it via the `Change stream permissions`
modal. Add is_announcement_only to the subscription type text.
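As a rough illustration only, assuming the subscribe endpoint accepts an
`is_announcement_only` parameter at stream-creation time in this version
(check the API documentation before relying on the exact names):

```python
import json

import requests

# Hypothetical server, credentials, and stream name.
requests.post(
    "https://zulip.example.com/api/v1/users/me/subscriptions",
    auth=("admin-bot@example.com", "API_KEY"),
    data={
        "subscriptions": json.dumps([{"name": "announcements"}]),
        "is_announcement_only": "true",
    },
)
```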
For some reason in my original version I was sending both
content and data to the client for submessage events,
where data === JSON.parse(content). There's no reason
not to just let the client parse it, since the client
already does that for data that comes on the original
message, and since we might eventually have non-JSON
payloads.
The server continues to validate that the payload
is JSON, and the client will blueslip if the server
regresses and sends bad JSON for some reason.
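A rough server-side sketch of that contract (hypothetical function name;
the real validation lives in the submessage code path):

```python
import json
from typing import Any, Dict


def build_submessage_event(
    message_id: int, sender_id: int, msg_type: str, content: str
) -> Dict[str, Any]:
    # The server only verifies that `content` is valid JSON and forwards the
    # raw string; clients run JSON.parse themselves, exactly as they already
    # do for submessages attached to the original message.
    json.loads(content)  # raises json.JSONDecodeError on a bad payload
    return dict(
        type="submessage",
        message_id=message_id,
        sender_id=sender_id,
        msg_type=msg_type,
        content=content,  # no redundant, pre-parsed `data` field
    )
```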