Looking for help with a small task of just changing some letters in a database that I'm trying to import.
It's a forum with around 30k users and some threads; the export file is about 1.7 GB and it's too big for my computer's CPU to handle, so I was wondering if there's maybe a way to GPU-accelerate it?
I've got an i7 7700K at 4.9 GHz and a GTX 1080.
I was thinking 1.7 billion characters shouldn't take weeks?
Or am I getting something wrong?
It's a relatively tiny database; it'll be fine on the CPU, if you do it correctly at least. There are some tricks, like disabling locking just for the import, altering how often commits happen, etc. It depends on which database it is, too.
And no, you cannot accelerate it with your GPU. There are databases out there that can benefit from a GPU, but I'm pretty sure none of them are SQL-based. A GPU is not a simple bolt-on accelerator; it does not work like that.
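Since the thread is about a phpMyAdmin/MySQL dump, here's a sketch of what "tricks during the import" could look like for MySQL/MariaDB specifically. This is an assumption about your setup, not a tuned recipe; the file name is made up, and you'd run it from the mysql client:

```sql
-- Hypothetical wrapper around a large import (MySQL/MariaDB).
-- Turn off per-statement commits and constraint checks while loading,
-- then restore them.  Only safe if you trust the dump's integrity.
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;

SOURCE my_dump.sql;  -- mysql client command that runs the dump file

SET foreign_key_checks = 1;
SET unique_checks = 1;
COMMIT;
```

The point is to let the server batch the work instead of validating and committing every single INSERT statement.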
From phpMyAdmin, in SQL format.
I have done that, but all I can find is a small "search and replace" for columns and not for the whole table, since I've got user names and all the threads with
There's a basic Linux command-line tool called sed. You can use it for a "quick and dirty" regex replace operation on any text. Search the web for "sed substitutions".
It shouldn't take more than a minute to go through the file.
If all you have is Windows, you can use it with WSL.
In order to not keep going through the file over and over, use head -n 10000 my_dump.sql > sample.sql to get the first 10k lines as a sample to help you craft the commands.
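A toy version of that workflow might look like this. The file name and the broken/fixed strings are made up (here the classic "é mis-read as Ã©" mojibake); substitute whatever garbled sequences your dump actually contains:

```shell
#!/bin/sh
# Fake one-line "dump" standing in for the real 1.7 GB file.
printf "INSERT INTO posts VALUES ('cafÃ©');\n" > my_dump.sql

# 1) Take a sample to experiment on (pointless on a one-line file,
#    but this is the idea for the real dump).
head -n 10000 my_dump.sql > sample.sql

# 2) Try the substitution on the sample and eyeball the output.
sed 's/Ã©/é/g' sample.sql

# 3) Once it looks right, run it over the whole dump.
sed 's/Ã©/é/g' my_dump.sql > my_dump_fixed.sql
```

You'd typically chain several such substitutions (one per garbled sequence) into a single sed invocation with multiple -e expressions.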
(For future reference: you can and should be using the utf8 character set in your CREATE TABLE statements, and as the character encoding for the web pages you're rendering, and from PHP you should be using the ICU library to sanitize/normalize any Unicode text to NFC form at the point where your web app takes user input.)
There's also a command-line tool called iconv that can do some conversions, but it's probably not worth getting into too much detail if you know exactly which substitutions you need to make.
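For completeness, here's what an iconv conversion looks like. This assumes the dump was written as Latin-1 and you want it as UTF-8, which may or may not match your situation; file names are made up:

```shell
#!/bin/sh
# Hypothetical demo: a one-line "dump" saved as Latin-1, converted to UTF-8.
printf 'caf\351\n' > sample_latin1.sql   # octal 351 = 0xE9 = é in Latin-1
iconv -f LATIN1 -t UTF-8 sample_latin1.sql > sample_utf8.sql
cat sample_utf8.sql                      # prints: café
```

This only helps when the whole file is consistently in one wrong encoding; mixed or double-encoded text is exactly where the targeted sed substitutions are the better tool.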
You can also try dumping tables in CSV format, and then you could manipulate them in Python. Reading/processing a few gigabytes of data that way with a simple replace shouldn't take more than 10-15 seconds. You can import the CSV back into MySQL afterwards… (this is also for future reference; use sed for now, it's simpler).
You could also do the conversion inside MySQL, using ALTER TABLE, SELECT … INTO, and/or triggers. Figuring out exactly how would take you longer than running sed; it would only be the right approach if you had more data and couldn't afford to go offline, or to go offline for long.
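The simplest in-database variant is probably MySQL's REPLACE() string function, run as an UPDATE per affected column. Table and column names below are made up for illustration:

```sql
-- Hypothetical in-database fix using MySQL's REPLACE() string function.
-- One UPDATE per affected column; the WHERE clause limits the rewrite
-- to rows that actually contain the garbled sequence.
UPDATE posts
SET body = REPLACE(body, 'Ã©', 'é')
WHERE body LIKE '%Ã©%';
```

On a live forum you'd want this inside a transaction, and as the reply says, for a one-off fix of this size sed on the dump is less work.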