The naive solution in #23465 creates situations where the same user can have
multiple reactions, because the base emoji are not unique: e.g. +1::skin2
and +1::skin4 both reduce to +1, but their user lists are kept separate.
This solution handles the reduction, merges reactions that share the same
base emoji, and deduplicates the resulting user list.
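A minimal sketch of the merge step (hypothetical helper and data shapes, not the actual import code):
```
from collections import defaultdict
from typing import Dict, List


def merge_same_base_reactions(reactions: Dict[str, List[int]]) -> Dict[str, List[int]]:
    # e.g. {"+1::skin2": [1, 2], "+1::skin4": [2, 3]} -> {"+1": [1, 2, 3]}
    merged: Dict[str, List[int]] = defaultdict(list)
    for name, user_ids in reactions.items():
        base_name = name.split("::", 1)[0]  # reduce to the base emoji
        merged[base_name].extend(user_ids)
    # Deduplicate each user list, keeping a deterministic order.
    return {name: sorted(set(user_ids)) for name, user_ids in merged.items()}
```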
Co-authored-by: Alex Vandiver <alexmv@zulip.com>
Co-authored-by: rht <rhtbot@protonmail.com>
Previously, emoji.json was read from
"$ZULIP_PATH/node_modules/emoji-datasource-google/emoji.json".
This path doesn't exist in production when installing from scratch from
a release tarball, so we now ensure emoji.json exists by copying it to
`static/generated/emoji`.
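Roughly, the copy step looks like this (paths are illustrative; the real build step resolves them relative to ZULIP_PATH):
```
import os
import shutil

source = "node_modules/emoji-datasource-google/emoji.json"
target_dir = "static/generated/emoji"

os.makedirs(target_dir, exist_ok=True)
shutil.copy(source, os.path.join(target_dir, "emoji.json"))
```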
With tweaks to comments by tabbott.
Fixes: #23469
This commit adds an optional (nullable) .realm attribute to Message
(and ArchivedMessage), with the server changes for making new Messages
have it set. Old Messages still have to be migrated to backfill this
before the column can be made non-nullable.
Appropriate test changes to correctly set .realm for Messages that the
tests create manually are included here as well.
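For orientation, the new field is roughly of this shape (a sketch; the on_delete behavior shown here is an assumption, not a claim about the actual model):
```
from django.db import models


class Realm(models.Model):  # stand-in for zerver.models.Realm
    ...


class Message(models.Model):
    # Nullable for now; a later migration must backfill old rows before
    # the column can be made non-nullable.
    realm = models.ForeignKey(Realm, null=True, on_delete=models.CASCADE)
```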
build_message has a lot of arguments, so it's hard to verify correctness
of callers that just try to get the order right. It's much clearer to be
explicit via kwargs. mattermost.py and rocketchat.py already do this, so
let's bring slack.py and gitter.py up to par.
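A toy illustration of the difference (not the real build_message signature):
```
def build_message(topic_name, date_sent, message_id, content, user_id,
                  recipient_id, realm_id):
    return dict(topic_name=topic_name, date_sent=date_sent, id=message_id,
                content=content, sender=user_id, recipient=recipient_id,
                realm=realm_id)


# Hard to review: which positional slot is which?
build_message("general", 1650000000.0, 1, "hi", 10, 20, 1)

# Much easier to verify with explicit kwargs:
build_message(
    topic_name="general",
    date_sent=1650000000.0,
    message_id=1,
    content="hi",
    user_id=10,
    recipient_id=20,
    realm_id=1,
)
```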
Because Slack emoji naming is different from Zulip's.
According to https://emojipedia.org/slack/, Slack's emoji shortcodes are
derived from https://github.com/iamcal/emoji-data.
There are probably some deviations from that dataset, but this PR should
at least catch the ones that are identical to iamcal's.
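A rough sketch of the translation idea, assuming a mapping is built from iamcal's emoji.json (the Zulip-side canonicalization is simplified away here):
```
import json


def build_slack_emoji_mapping(emoji_json_path: str) -> dict:
    # Map every Slack shortcode (including aliases) to a canonical name,
    # using iamcal's emoji-data as the source of truth.
    with open(emoji_json_path) as f:
        emoji_data = json.load(f)
    mapping = {}
    for entry in emoji_data:
        canonical = entry["short_name"]
        for alias in entry.get("short_names", [canonical]):
            mapping[alias] = canonical
    return mapping
```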
If there is more than one room with the same set of users, the import
will fail due to a unique constraint on the huddle_hash. Figuring out
why, and which room is causing this database error, is quite difficult.
We deduplicate those cases here and simply merge the rooms together.
Note, however, that the deduplication does not work as expected, so for
now we ignore such rooms altogether and only raise an exception along
with some logging output. At least this way it is pretty clear what is
wrong, and you do not have to wait for a database error during the
actual import.
We also ignore empty huddle rooms, since those are the duplicates that
caused problems for me, and if they are empty, ignoring them is easier
than trying to get the merge to work.
Not sure where those channels come from, since we discovered this with
production data.
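Conceptually, the check boils down to grouping rooms by their member set and flagging collisions (field names reflect my reading of Rocket.Chat's room documents and may differ):
```
from collections import defaultdict


def find_duplicate_huddles(rooms: list) -> dict:
    rooms_by_members = defaultdict(list)
    for room in rooms:
        members = frozenset(room.get("uids", []))
        if not members:
            # Empty huddle rooms are skipped outright; they were the
            # duplicates observed in production data.
            continue
        rooms_by_members[members].append(room["_id"])
    # Any member set that maps to more than one room would violate the
    # unique constraint on huddle_hash during import.
    return {m: ids for m, ids in rooms_by_members.items() if len(ids) > 1}
```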
Signed-off-by: Florian Pritz <bluewind@xinu.at>
Not sure where those come from since we discovered this with production
data. Somehow there were reactions with usernames that were old and no
longer existed.
Signed-off-by: Florian Pritz <bluewind@xinu.at>
Not sure where those come from since we discovered this with production
data.
There was only a single instance of this in my entire batch of data, in
an old message from the time when we started using Rocket.Chat. This
might be an old issue, or it might require some special settings that
were later changed.
Signed-off-by: Florian Pritz <bluewind@xinu.at>
Users will only be able to log in via GitHub, because imported users
get GitHub's generated noreply email addresses - so this should be the
only auth method enabled at first, to avoid confusion.
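In settings terms this amounts to something like the following in /etc/zulip/settings.py (a sketch; see the Zulip authentication docs for the authoritative configuration):
```
AUTHENTICATION_BACKENDS = (
    "zproject.backends.GitHubAuthBackend",
)
```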
Only ["id"] is accessed on the dicts (representing the external tool
users). Given that for some tools the id may be under a different name
etc. due to different user dicts format, it's best to just pass those
ids to the function so that it can stay generalized and not reliant
on a specific user dict format.
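The shape of the change, roughly (hypothetical helper and sample data, for illustration only):
```
slack_users = [{"id": "U01ABCDE"}, {"id": "U02FGHIJ"}]
rocketchat_users = [{"_id": "abc123"}, {"_id": "def456"}]


def mark_long_term_idle(user_ids: set) -> None:
    # Generalized helper: it only needs the ids, not the full user
    # dicts, so it works regardless of the tool's export format.
    ...


mark_long_term_idle({user["id"] for user in slack_users})
mark_long_term_idle({user["_id"] for user in rocketchat_users})
```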
get_timestamp_from_message was extracted in the previous commit. We can
deduplicate and make the code a bit cleaner by using it where
appropriate instead of accessing message["ts"] directly.
message["ts"] is slack-specific. For this to be a general util function
it needs to take a callable that will grab a timestamp from the message
dict (which has varying formats depending on what we're importing from).
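Schematically, the util takes a callable so each importer supplies its own timestamp extraction (names here are illustrative):
```
from typing import Any, Callable, Dict, List


def collect_timestamps(
    messages: List[Dict[str, Any]],
    get_timestamp: Callable[[Dict[str, Any]], float],
) -> List[float]:
    # The callable hides the tool-specific format; e.g. Slack would pass
    # lambda message: float(message["ts"]).
    return [get_timestamp(message) for message in messages]
```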
It is apparently possible to have a mention of a user who is not (or
no longer?) in the `users.bson` table.
Skip such mentions for the purposes of the Zulip import; there's nothing
better for us to do.
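The skip itself is trivial (a sketch; username_to_user_id stands in for the map built from users.bson):
```
username_to_user_id = {"alice": 11, "bob": 12}
mentioned_usernames = ["alice", "ghost-user", "bob"]

mention_user_ids = set()
for username in mentioned_usernames:
    if username not in username_to_user_id:
        continue  # not in users.bson; nothing better to do than skip it
    mention_user_ids.add(username_to_user_id[username])
```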
This is likely an error somewhere in Rocket.Chat's MongoDB "eventual
consistency," but there is no problem with skipping the chunks at this
step.
In the one case where this was observed so far, the upload-id was not
referenced in any message -- if it is referenced and has chunks, but
has no metadata, we will fail later, at that reference.
We construct model instances in the import tool solely for the purpose
of serializing them with the `model_to_dict` helper that returns a
dictionary. Passing `float` to these models' DateTimeField is not
accepted by the type checker. Modifying the dictionary instead avoids
this typing issue.
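The pattern in question, sketched with illustrative names:
```
from typing import Any, Dict

from django.forms.models import model_to_dict


def export_message(message: Any, timestamp: float) -> Dict[str, Any]:
    # Serialize the instance first, then set the float timestamp on the
    # plain dict, so mypy never sees a float assigned to a DateTimeField.
    message_dict = model_to_dict(message, exclude=["date_sent"])
    message_dict["date_sent"] = timestamp
    return message_dict
```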
Signed-off-by: Zixuan James Li <p359101898@gmail.com>
Now that we can assume Python 3.6+, we can use the
email.headerregistry module to replace hacky manual email address
parsing.
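For reference, the stdlib API in question:
```
from email.headerregistry import Address

address = Address(addr_spec="imported.user@example.com")
username = address.username  # "imported.user"
domain = address.domain      # "example.com"
```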
Signed-off-by: Anders Kaseorg <anders@zulip.com>
Add None-checks, rename variables (to avoid the "redefinition of the
same variable with a different type" error), and add the necessary
type annotations.
This is a part of #18777.
Signed-off-by: Zixuan James Li <359101898@qq.com>
history_public_to_subscribers wasn't explicitly set when creating
streams via build_stream, thus relying on the model's default of False.
This led to public streams being created with that value set to False,
which doesn't make sense.
We can solve this by inferring the correct value based on invite_only in
the build_stream function itself, rather than needing to add a flag
argument to it.
This commit also includes a migration to fix public streams with the
wrong history_public_to_subscribers value.
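A sketch of the inference inside build_stream (argument list abbreviated; the real rule may also consult realm settings):
```
def build_stream(name: str, realm_id: int, invite_only: bool = False) -> dict:
    # Public streams should have shared history; private streams keep it
    # restricted, instead of falling back to the model default of False.
    history_public_to_subscribers = not invite_only
    return {
        "name": name,
        "realm": realm_id,
        "invite_only": invite_only,
        "history_public_to_subscribers": history_public_to_subscribers,
    }
```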
Fixes #21784.
4815f6e28b tried to de-duplicate bot
email addresses, but instead caused duplicates to crash:
```
Traceback (most recent call last):
  File "./manage.py", line 157, in <module>
    execute_from_command_line(sys.argv)
  File "./manage.py", line 122, in execute_from_command_line
    utility.execute()
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/__init__.py", line 413, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/base.py", line 354, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/base.py", line 398, in execute
    output = self.handle(*args, **options)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/management/commands/convert_slack_data.py", line 59, in handle
    do_convert_data(path, output_dir, token, threads=num_threads)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 1320, in do_convert_data
    ) = slack_workspace_to_realm(
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 141, in slack_workspace_to_realm
    ) = users_to_zerver_userprofile(slack_data_dir, user_list, realm_id, int(NOW), domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 248, in users_to_zerver_userprofile
    email = get_user_email(user, domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 406, in get_user_email
    return SlackBotEmail.get_email(user["profile"], domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 85, in get_email
    email_prefix += cls.duplicate_email_count[email]
TypeError: can only concatenate str (not "int") to str
```
Fix the stringification, make it case-insensitive, append with a dash
for readability, and add tests for all of the above.
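A simplified, standalone sketch of the corrected counter logic (the real fix lives in the SlackBotEmail class):
```
from collections import defaultdict

duplicate_email_count: dict = defaultdict(int)


def deduplicate_bot_email(email_prefix: str, domain_name: str) -> str:
    # Count duplicates case-insensitively, and append the stringified
    # counter with a dash so "mybot-2@example.com" stays readable.
    email = f"{email_prefix}@{domain_name}".lower()
    duplicate_email_count[email] += 1
    if duplicate_email_count[email] > 1:
        email_prefix += "-" + str(duplicate_email_count[email])
    return f"{email_prefix}@{domain_name}".lower()
```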
This resolves the issues reported in #20108, a major chunk of which were
due to the incomplete support for importing livechat streams/messages in
the tool. So, it's best not to import any livechat streams/messages for
now, until complete support for importing them is developed.
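For context, skipping them can be as simple as filtering on the room type up front (a sketch; "t" == "l" marks livechat rooms in Rocket.Chat exports, as far as I can tell):
```
rooms = [
    {"_id": "r1", "t": "c", "name": "general"},
    {"_id": "r2", "t": "l", "name": "livechat-visitor"},
]
importable_rooms = [room for room in rooms if room.get("t") != "l"]
```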