Instead of updating a fixed number of rows (based on the number of rows available at the start of the update), the "update_column_in_batches" method now continues updating rows until it runs out of rows to process. For a table with a high rate of inserts this may cause the migration to take quite some time. The alternative, however, is that not all rows get updated, or that the "change_column_null" method raises an error because NULL values remain.
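
For context, a minimal sketch of how this helper is typically used in a post-deployment migration, assuming GitLab's Gitlab::Database::MigrationHelpers module and a hypothetical issues.closed_at_set column; the batched update is followed by "change_column_null", which only succeeds once no NULL values are left:

```ruby
# Hypothetical migration illustrating the batched backfill described above.
# The table and column names are examples, not taken from the commit itself.
class FillClosedAtSetOnIssues < ActiveRecord::Migration
  include Gitlab::Database::MigrationHelpers

  # Batched updates must run outside a single DDL transaction.
  disable_ddl_transaction!

  def up
    # Keeps updating rows batch by batch until no rows are left to process,
    # including rows inserted while the migration is running.
    update_column_in_batches(:issues, :closed_at_set, false)

    # Safe to enforce NOT NULL once every row has a value.
    change_column_null(:issues, :closed_at_set, false)
  end

  def down
    change_column_null(:issues, :closed_at_set, true)
  end
end
```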
| Name |
|---|
| .. |
| api |
| assets |
| backup |
| banzai |
| ci |
| container_registry |
| gitlab |
| json_web_token |
| omni_auth |
| rouge/formatters |
| support |
| tasks |
| banzai.rb |
| disable_email_interceptor.rb |
| event_filter.rb |
| extracts_path.rb |
| file_size_validator.rb |
| file_streamer.rb |
| gitlab.rb |
| gt_one_coercion.rb |
| repository_cache.rb |
| static_model.rb |
| unfold_form.rb |
| uploaded_file.rb |
| version_check.rb |