How to save about 500GB of data in a database?

I want to save the Bitcoin data and build a Bitcoin indexer. While saving the data, the write speed keeps getting slower. The complete data set is about 500GB, and once the tables reach about 20GB, writes become extremely slow. Note that I have 4 tables, each with some indexes, and I've tried both MongoDB and MySQL. What would be a proper approach for this?
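For reference, a minimal sketch of the load-first, index-later pattern that is commonly recommended for bulk loads in MySQL/InnoDB; the table and column names here are invented purely for illustration:

-- A minimal sketch, assuming MySQL/InnoDB; table, column, and row values are illustrative only.
-- 1. Create the table with just the primary key; add secondary indexes after the load.
CREATE TABLE txs (
  txid   CHAR(64)     NOT NULL,  -- hex-encoded transaction hash
  block  INT UNSIGNED NOT NULL,
  amount BIGINT       NOT NULL,
  PRIMARY KEY (txid)
) ENGINE=InnoDB;

-- 2. Insert in large batches (multi-row INSERTs) inside explicit transactions,
--    rather than one autocommitted row at a time.
START TRANSACTION;
INSERT INTO txs (txid, block, amount) VALUES
  ('dummy_txid_0001', 1, 5000000000),
  ('dummy_txid_0002', 1, 2500000000);
COMMIT;

-- 3. Build the secondary indexes once, after the bulk load has finished.
ALTER TABLE txs ADD INDEX idx_block (block);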

Changes to server configuration option remote access are not supported in SQL Database Managed Instances

We have just set up our new SQL Server Managed Instance, restored a sample database for testing, and run Azure's vulnerability assessment, which produces this high-risk finding:

VA2120 – Features that may affect security should be disabled

The more SQL Server features and services you enable, the larger its attack surface becomes, making your system more vulnerable to potential attacks. These features should be disabled unless they are absolutely needed in this environment.

Remediation Script:

EXECUTE sp_configure 'show advanced options', 1;
RECONFIGURE WITH OVERRIDE;
EXECUTE sp_configure 'remote access', 0;
RECONFIGURE;
EXECUTE sp_configure 'show advanced options', 0;
RECONFIGURE;

Turning to Google before doing anything, I found this Microsoft Docs article, which states (emphasis mine):

This topic is about the "Remote Access" feature. This configuration option is an obscure SQL Server to SQL Server communication feature that is deprecated, and you probably shouldn’t be using it.

Can anyone therefore please provide some clarity on the following?

  1. Why is it enabled given Microsoft’s description?
  2. Does it need to be enabled in Azure SQLMI? Because…
  3. When I run the remediation script I get this error:

Changes to server configuration option remote access are not supported in SQL Database Managed Instances.
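For what it's worth, the current value of the option can at least be inspected read-only, without any RECONFIGURE; a minimal check against sys.configurations:

-- Read-only check of the 'remote access' setting (no RECONFIGURE involved).
SELECT name, value, value_in_use, is_dynamic
FROM sys.configurations
WHERE name = N'remote access';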

How to add a volume to a MySQL database currently in use

Related to the problem I faced here (No space left on device), I want to adopt a long-term solution to the no-space-left-on-device issue. One of the actions is to add a volume to the machine on which the database runs (MySQL 8, Ubuntu 20.04, DigitalOcean provider).

I would like to know the safest way to do this, and which rules to follow, in order to avoid losing references and/or data when I add a volume to the machine, considering that this database is running in production without any replication.
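For reference, before any volume is attached, the MySQL-side facts that matter are where the data directory currently lives and how much space each schema needs; a minimal read-only check (the OS-level steps of attaching and mounting the volume, stopping MySQL, copying the files, and updating datadir in my.cnf are separate):

-- Where MySQL currently stores its data files (typically /var/lib/mysql on Ubuntu).
SELECT @@datadir;

-- Approximate size per schema, to confirm how much space the new volume needs.
SELECT table_schema,
       ROUND(SUM(data_length + index_length) / 1024 / 1024 / 1024, 2) AS size_gb
FROM information_schema.tables
GROUP BY table_schema;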

Thank you in advance for the support

Best Database to be shipped with my application?

I have a .NET Core application that uses a database. I need to create an installer using NSIS in which I will package my application along with a database, so that the client can easily install my application and the database, along with all their dependencies, using a simple wizard.

I want a suggestion regarding the database.

Requirements:

  1. Easy to install: it must be lightweight, have as few dependencies as possible, offer zip binaries to install from, and install without errors.
  2. The database should be able to handle a large number of records and remote connections.

What I have tried:

  • MSSQL Server: no zip-binary option, large size, and many dependencies.
  • SQLite: it's file-based, so remote connections aren't possible.
  • PostgreSQL: it seemed a perfect choice, but it has many installation issues and bugs; even the official installer failed to install on some machines.

How to migrate a SQL Server Erwin Mart database to Aurora (Amazon RDS)

I want to migrate a SQL Server Erwin Mart database to Aurora and am trying to figure out the easiest/quickest way to do that.

The options, as I see them, are:

  1. Save the models to the file system, repoint the application to the new mart database, then load the models from the file system into the new database. https://support.erwin.com/hc/en-us/articles/360003443452-Java-scripts-that-automatically-save-a-mart-s-models-offline-to-a-drive and https://support.erwin.com/hc/en-us/articles/115002674131-ERWIN-DATA-MODELER-MART-API-RESOURCE-PAGE

Has anyone got any experience using these APIs?

  2. Export/import, using the MySQL migration tool (https://www.mysql.com/products/workbench/migrate/) or the Amazon migration tool. Does anyone know if the schema is the same, i.e. can I simply export/import the data (see the sketch below)?
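For reference, a rough way to sanity-check whether the schemas line up after an export/import is to compare table and column counts on each side; a minimal sketch using information_schema, which exists on both SQL Server and MySQL-compatible Aurora (the schema name 'mart' is illustrative):

-- Run on both source and target, then diff the output.
-- On SQL Server, table_schema is the schema name; on MySQL/Aurora it is the database name.
SELECT table_name, COUNT(*) AS column_count
FROM information_schema.columns
WHERE table_schema = 'mart'
GROUP BY table_name
ORDER BY table_name;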

Database connection lost whenever SQL query string is too long

I recently switched from running my Rails app on a single VM to running the database (MariaDB 10.3) on a separate (Debian Buster) VM. Now that the database is on a separate server, Rails immediately throws Mysql2::Error::ConnectionError: MySQL server has gone away whenever it tries to make a query where the SQL string itself is very long. (The query itself isn't necessarily one that would put significant load on the system.)

An example query that causes the problem looks like this:

SELECT `articles`.`id` FROM `articles` WHERE `articles`.`namespace` = 0 AND `articles`.`wiki_id` = 1 AND `articles`.`title` IN ('Abortion', 'American_Civil_Liberties_Union', 'Auschwitz_concentration_camp', 'Agent_Orange', 'Ahimsa') 

… except the array of titles is about 5000 items long, and the full query string is ~158kB.

On the database side, this corresponds to warnings like this:

2021-03-25 15:47:13 10 [Warning] Aborted connection 10 to db: 'dashboard' user: 'outreachdashboard' host: 'programs-and-events-dashboard.globaleducation.eqiad1.wikimed' (Got an error reading communication packets)

The problem seems to be with the network layer, but I can't get to the bottom of it. I've tried adjusting many MariaDB config variables (max_allowed_packet, innodb_log_buffer_size, innodb_log_file_size, innodb_buffer_pool_size), but none of those made a difference. It appears that the connection is aborted while the long SQL query string is being transmitted from the app server to the database server. (There's no corresponding problem with receiving large query results from the database.)
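For reference, a quick read-only check of what the server is actually running with (and whether aborted connections are accumulating), using standard MariaDB/MySQL variable and status names:

-- Confirm the values the running server is actually using.
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
SHOW GLOBAL VARIABLES LIKE 'net_%timeout';   -- net_read_timeout / net_write_timeout

-- Counters that increase when connections die mid-communication.
SHOW GLOBAL STATUS LIKE 'Aborted_c%';        -- Aborted_clients / Aborted_connects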

I’ve tried adjusting several timeout-related settings as well, although that seems unlikely to be the problem because I can replicate the connection error without any significant wait, just by issuing one of the long-SQL-string queries from a Rails console.

I’ve tried using tcpdump to see what’s coming in, but didn’t pick up any additional clues from that.