On 13/09/2021 23:07, ourdiaspora wrote:
Readers,
Please could someone help to explain why the following code fails:
<?php
$uploaddir = '~/tmp/cpac/cpactmp/';
$uploadfile = $uploaddir . basename($_FILES['userfile']['name']);
$dbconnect = pg_connect("dbname=cpacweb user=cpaca");
$targetfilepath = $uploaddir . $uploadfile;
if(isset($_POST["submit"])){
    $csvfile = $_FILES["userfile"]["tmp_name"];
    if($_FILES["userfile"]["tmp_name"] > 0) {
        $csvfileopen = fopen($csvfile, "r");
        while(($csvfiledata = fgetcsv($csvfileopen, 10000, ","))){
            $csvinsert = "INSERT into somedata(name, id, emailaddress, groupname) VALUES ('" .
                $csvfiledata[0] . "', '" . $csvfiledata[1] . "', '" .
                $csvfiledata[2] . "', '" . $csvfiledata[3] . "')";
        }
    }
    fclose($csvfileopen);
}
$query = "SELECT * FROM somedata";
$result = pg_exec($dbconnect, $query);
echo "Number of rows: " . pg_numrows($result);
pg_freeresult($result);
pg_close($dbconnect);
?>
The trial csv file contains 2 rows of data, but the query result is zero.
It looks like you're building the insert query string, but you're never
actually executing it, so the database _will_ be empty!
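The missing step could be sketched like this: execute each row's INSERT as it's read, using pg_query_params() so the values are passed separately from the SQL rather than concatenated in. This is only a rough sketch; the table and column names are taken from your script, and error handling is omitted:

```php
<?php
// Sketch of the missing execution step: run the INSERT for each CSV row
// instead of only building the query string. Table/columns as in the
// original script; no error handling shown.
$dbconnect = pg_connect("dbname=cpacweb user=cpaca");

if (isset($_FILES["userfile"]) && is_uploaded_file($_FILES["userfile"]["tmp_name"])) {
    $csvfileopen = fopen($_FILES["userfile"]["tmp_name"], "r");
    while (($csvfiledata = fgetcsv($csvfileopen, 10000, ",")) !== false) {
        // pg_query_params() keeps the values out of the SQL string entirely
        pg_query_params(
            $dbconnect,
            'INSERT INTO somedata (name, id, emailaddress, groupname) VALUES ($1, $2, $3, $4)',
            $csvfiledata
        );
    }
    fclose($csvfileopen);
}
?>
```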
If I may make some suggestions:
You're checking whether $_POST["submit"] is set, but you're not actually
using anything from the POST data; you're using $_FILES directly, so it
might be better to check that instead.

Also, when you do end up executing the query, I'd avoid doing it from
within the loop, one insert at a time, as that can be very slow. It's
generally faster to do bulk inserts with your database.

Finally, you're building the insert query with plain string
concatenation and no sanitisation of the input. Consider using PDO with
prepared statements, which protects against SQL injection. It also
appears to have a method, pgsqlCopyFromFile, which might do in a single
call what you're doing here.
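A rough sketch of the PDO approach, with the values bound through a prepared statement and the whole import wrapped in one transaction (the DSN and credentials are placeholders, and the table/column names are taken from your script):

```php
<?php
// Sketch only: PDO prepared statement, values bound rather than
// concatenated into the SQL. DSN and credentials are placeholders.
$pdo = new PDO("pgsql:dbname=cpacweb", "cpaca");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare(
    "INSERT INTO somedata (name, id, emailaddress, groupname)
     VALUES (:name, :id, :emailaddress, :groupname)"
);

$fh = fopen($_FILES["userfile"]["tmp_name"], "r");
$pdo->beginTransaction();   // one transaction instead of a commit per row
while (($row = fgetcsv($fh, 10000, ",")) !== false) {
    $stmt->execute([
        ":name"         => $row[0],
        ":id"           => $row[1],
        ":emailaddress" => $row[2],
        ":groupname"    => $row[3],
    ]);
}
$pdo->commit();
fclose($fh);

// Or, as a single bulk load via the Postgres-specific PDO method:
// $pdo->pgsqlCopyFromFile("somedata", $_FILES["userfile"]["tmp_name"], ",");
?>
```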
--
Ashley Sheridan
https://www.ashleysheridan.co.uk