
Syntax error with simple SQL insert and backup

  • Hi, this is a two-part request.

Part 1: I have an Excel spreadsheet almost 44,000 rows long, but I have been unable to get its contents into my MySQL database, which I manage through phpMyAdmin. When I use this script:

LOAD DATA INFILE "C:/sample1.csv" INTO TABLE products FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY "\n";

I get this error:

MySQL said: #1045 - Access denied for user: '[email protected]' (Using password: YES)

(actual numbers changed here for security). I use either a .csv file created with Excel's Save As or a file created in my editor, and I've tried comma-delimited and tab-delimited files with careful attention to variable types. At this point I use just a one-row file because I'm trying to eliminate error possibilities, but I don't know what else to do.

The access-denied error only comes up when using the script above. I can circumvent it with this procedure instead: double-click the "products" link in the left-hand margin to bring up the products table, click the "Query window" link in the left-hand margin, click the "Import files" tab, "Browse" to the file, click "Autodetect", then click "Go". That avoids the access-denied error, but then a syntax error comes up that I can't figure out:

MySQL said: #1064 - You have an error in your SQL syntax near 'b2, c3, d4, e5, f6, h7, i8, j9' at line 1

Where could the syntax error be coming from? I'm not a programmer, so you will need to tell me what information you need to answer this question and how to get it. What I'm looking for is either a script that gives no access error (and no syntax error once access is allowed), or an explanation of how to transfer my data without syntax errors using the click commands of phpMyAdmin.
Part 2: There is another simple script I'd like to execute: one that does a complete mirror copy of my web site to another URL for backup. Because I have a slow dial-up connection, the products database is large and growing, and there are many web pages plus a growing customer database, I'd prefer the backup copy to go from one URL's server to the other automatically and at high speed; FTPing the copy down to my PC as a backup is not practical.

Ideally I will have 10 to 15 directories at my other URL, and each time I execute the backup it will copy into the next consecutive directory. That way, if I run it once a day, I will have two weeks' worth of backups; if I don't catch a corruption for a couple of days, I still have a previous good backup I can restore from. I have PHP and cgi-bin available to me at both sites.

For a quick answer with working code, I will include a $25 tip. Thank you for your help in this. Steve
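A note on the Part 1 errors, since the answer below concentrates on Part 2: the #1045 access-denied error from LOAD DATA INFILE usually means the MySQL account lacks the server-side FILE privilege, which shared hosts rarely grant. Adding the LOCAL keyword makes the client, rather than the server, read the file; that needs no FILE privilege, but it only works when the statement is run from a MySQL client on the machine that actually holds sample1.csv. The 'b2, c3, d4 ...' fragment in the #1064 message looks like row values reaching MySQL unquoted, which points at a quoting or line-ending problem rather than at the data itself. A sketch, with a hypothetical column list to replace with the real columns of products:

```sql
-- Hypothetical column names; list the real columns of `products` in CSV order.
LOAD DATA LOCAL INFILE 'C:/sample1.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'   -- Excel on Windows ends lines with \r\n, not \n
IGNORE 1 LINES               -- skip the header row, if the file has one
(col_a, col_b, col_c);
```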

  • Answer:

    Below is the PHP script (I have also FTP'd it to my site: http://mike.dewolfe.bc.ca/scripts/buftp.php; that is an MS site, so it won't execute the PHP code). What it does is generate two archives using system commands, tar and gzip: one of the HTML directory (stored in $wwwd) and one of the MySQL database. It puts these files into a tmp directory. Then it makes an FTP connection to your other site, creates a backup directory named from the year, month, and day, liberalizes the permissions on that directory, uploads the two files, and closes the connection. There is no HTML output. You need to look at the variables in the "Changeable Variables" section of the script and change them to match your setup (FTP user name, MySQL user name, directory for the HTML, and so on).

    To get this to run at regular intervals, you need to set up a "cron job" on the main site. This is something your ISP may not allow. If they do, you can use this discussion to help you along: http://www.phparch.com/discuss/viewtopic.php?p=1659 Basically, get a terminal session going and type:

    crontab -e

    This opens the cron job editor. If you want to run this daily, your cron entry should look something like this:

    30 3 * * * php /your/site/called/from/the/root/buftp.php

    So, at 3:30 AM every day, this would make PHP call your backup script. If you want to run this weekly, your cron entry should look something like this:

    30 3 * * 5 php /your/site/called/from/the/root/buftp.php

    So, at 3:30 AM every Friday, this would make PHP call your backup script. Because of how cron works, you need to call the buftp.php script by its full path. It makes good sense to run this script during a time you expect to be quiet; otherwise it will be competing with your regular visitors while it does a full site backup.

    Save the code below as "buftp.php" in a directory where you can execute PHP.
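The tar-and-gzip stage that the script runs through system() can be sketched as the following shell commands. The paths here are temporary stand-ins for the real $wwwd and $arcd; the mysqldump | gzip stage is analogous but needs a live MySQL server, so it is omitted:

```shell
# Stand-ins for the script's $wwwd (HTML dir) and $arcd (staging dir);
# substitute your real paths.
WWWD=$(mktemp -d)
ARCD=$(mktemp -d)/
echo '<html>hello</html>' > "$WWWD/index.html"

# Step 1: tar the web directory into one archive
# (the trailing "." is required; tar needs at least one file operand)
tar --create --directory="$WWWD" --file="${ARCD}website.tar" .

# Step 2: gzip the tar; this replaces website.tar with website.tar.gz
gzip -9 "${ARCD}website.tar"

ls "${ARCD}"   # prints: website.tar.gz
```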
<?php
////////////////////////////////////////////////
//                                            //
//  backup and FTP DB                         //
//  by Mike DeWolfe                           //
//                                            //
//  Requires:                                 //
//    PHP 4.1.0+ with zlib support            //
//    MySQL 3.22 (or higher)                  //
//                                            //
//  This will post to any FTP server          //
//                                            //
////////////////////////////////////////////////

// Changeable Variables
if ($_SERVER['DOCUMENT_ROOT']) {
    $docroot = $_SERVER['DOCUMENT_ROOT'];
} else {
    $docroot = ".";
}

// Directory stuff
$arcd = $docroot."/tmp/";   // where the archives are staged
$arcf = "website.tar";
$arcm = "mysql.gz";
$wwwd = $docroot."/web/";   // the HTML directory to back up

// Database stuff
$host = "localhost";
$user = " ";   // the MySQL username
$pass = " ";   // the MySQL password
$db   = " ";   // the database's name

// FTP stuff
$ftpsite = "your ftp site";
$ftpuser = "your ftp username";
$ftppass = "your ftp password";
$ftpdir  = "where the subdirs go";

// get the files and zip them
// - first, delete the old archive, if one exists:
if (file_exists($arcd.$arcf)) {
    unlink($arcd.$arcf);
}
// - tar the web directory into one archive
//   (the trailing "." names the files to include; without at least one
//   file operand, GNU tar refuses to create an empty archive)
$wdump = sprintf(
    'tar --create --directory=%s --file=%s%s .',
    $wwwd, $arcd, $arcf
);
system($wdump);
// - then gzip the tar
$wdump = sprintf('gzip -9 %s%s', $arcd, $arcf);
system($wdump);

// get the database and zip it
// - first, delete the old dump, if one exists:
if (file_exists($arcd.$arcm)) {
    unlink($arcd.$arcm);
}
// - dump it and zip it into one file next to the site archive
$mdump = sprintf(
    'mysqldump --opt -h %s -u %s -p%s %s | gzip > %s%s',
    $host, $user, $pass, $db, $arcd, $arcm
);
system($mdump);

// open an FTP connection,
// make a date-named directory,
// and upload the site archive and the database archive
$ftpconn = ftp_connect($ftpsite);
$login_result = ftp_login($ftpconn, $ftpuser, $ftppass);
if ($ftpconn && $login_result) {
    // away we go
    ftp_pasv($ftpconn, true);
    // set up the new directory, zero-padded YYYYMMDD so the
    // names sort correctly and never collide
    $today = getdate();
    ftp_chdir($ftpconn, $ftpdir);
    $path = sprintf('%04d%02d%02d', $today['year'], $today['mon'], $today['mday']);
    ftp_mkdir($ftpconn, $path);
    $chmod_cmd = "CHMOD 0777 ".$path;
    $chmod = ftp_site($ftpconn, $chmod_cmd);
    ftp_chdir($ftpconn, $path);
    // upload the mysql and webdir archives
    $upload = ftp_put($ftpconn, $ftpdir."/".$path."/".$arcf.".gz",
                      $arcd.$arcf.".gz", FTP_BINARY);
    $upload = ftp_put($ftpconn, $ftpdir."/".$path."/".$arcm,
                      $arcd.$arcm, FTP_BINARY);
}
// whether successful or failed...
ftp_close($ftpconn);
?>
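The script names each backup directory after the date, so directories accumulate indefinitely on the remote site. The fixed pool of 10 to 15 reused directories that Steve asked for could instead be derived from the day of the year; a sketch, assuming a pool of 14 slots and a hypothetical backup_NN naming scheme:

```shell
POOL=14                            # reuse 14 slots: two weeks of daily backups
DAY=$(date +%j | sed 's/^0*//')    # day of year (1..366), leading zeros stripped
SLOT=$(( (DAY - 1) % POOL ))       # 0..13
DIR=$(printf 'backup_%02d' "$SLOT")
echo "$DIR"                        # e.g. backup_03; the name repeats every 14 days
```

Each run would then upload into $DIR, silently overwriting the two-week-old copy that previously occupied that slot.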

stevep234-ga at Google Answers
