<div dir="ltr">Hi Jon,<div><br><div>1) How many columns are there in this file?</div></div><div>2) Do you create a new table yourself with all the needed fields,</div><div>or do you select the option to create it automatically?</div><div><br></div><div>My test file (9.5 MB, 11 fields, 30,000 records) was imported into a prepared table in a few seconds;</div><div>here's a video of the process:</div><div><a href="https://www.dropbox.com/s/1vel4mroshjuzam/csv_import.mp4?dl=0">https://www.dropbox.com/s/1vel4mroshjuzam/csv_import.mp4?dl=0</a><br></div><div><br></div><div>--</div><div>Best regards,</div><div>Sergey Pashkov</div></div><div class="gmail_extra"><br><div class="gmail_quote">On Fri, Oct 10, 2014 at 5:49 PM, Ruslan Zasukhin <span dir="ltr"><<a href="mailto:ruslan_zasukhin@valentina-db.com" target="_blank">ruslan_zasukhin@valentina-db.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On 10/10/14, 3:40 PM, "Jonathan Evans" <<a href="mailto:consultjonevans@gmail.com">consultjonevans@gmail.com</a>> wrote:<br>
<br>
Hi Jonathan,<br>
<br>
It seems this is a problem in Valentina Studio itself.<br>
Valentina DB, via API commands, can of course do this fast.<br>
<br>
<br>
Just a month ago there was a fix for SQLite in this area.<br>
<br>
Sergey, maybe a flush() was inserted into the loop?<br>
Please check and let me know in chat...<br>
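[Editor's note: the flush()-in-the-loop pattern Ruslan suspects is a classic bulk-insert pitfall. The sketch below is hypothetical and uses Python's sqlite3 module, not Valentina's actual API, purely to illustrate why committing inside the insert loop is slow: every commit forces a durable write, so 30,000 rows pay 30,000 sync costs, whereas one transaction around the whole batch pays only one.]<br>

```python
import sqlite3

# Illustrative sketch only -- NOT Valentina Studio's import code.
# Demonstrates per-row commit (the suspected flush()-in-loop bug)
# versus one commit for the whole batch.
def import_rows(rows, commit_per_row):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (a TEXT, b TEXT)")
    for row in rows:
        conn.execute("INSERT INTO t VALUES (?, ?)", row)
        if commit_per_row:
            conn.commit()  # slow path: one sync per inserted row
    conn.commit()          # fast path: single sync for the batch
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return count

rows = [("x", "y")] * 30000
print(import_rows(rows, commit_per_row=False))  # → 30000
```

On a real on-disk database the difference is typically orders of magnitude, which would match a 9 MB import taking an hour instead of seconds.<br>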
<span class="im HOEnZb"><br>
<br>
> Yes that's right.<br>
><br>
> On Fri, Oct 10, 2014 at 6:37 AM, Sergey Pashkov<br>
> <<a href="mailto:sergey_pashkov@valentina-db.com">sergey_pashkov@valentina-db.com</a>> wrote:<br>
> Hi Jon,<br>
> So you have tried to import CSV file into a new table in a local Valentina<br>
> Database and it was very slow?<br>
><br>
> --<br>
> Best regards,<br>
> Sergey Pashkov.<br>
><br>
> On Fri, Oct 10, 2014 at 3:19 PM, Jonathan Evans <<a href="mailto:consultjonevans@gmail.com">consultjonevans@gmail.com</a>><br>
> wrote:<br>
> Hi there<br>
> Have just used Valentina Studio's import from csv wizard to import a 9MB csv<br>
> file with 30,000 rows into a new table. It took about one hour.<br>
> This seems slow to me. Opening the same csv in Excel takes seconds. Importing<br>
> into MySQL is quick also.<br>
> 1) What are your experiences of importing medium-large csv files into<br>
> Valentina?<br>
> 2) Any tips for speeding up the process?<br>
> 3) My aspiration is to import much larger csv files with 1 million + rows. How<br>
> realistic is that?<br>
> Thanks for your help.<br>
<br>
</span><span class="HOEnZb"><font color="#888888">--<br>
Best regards,<br>
<br>
Ruslan Zasukhin<br>
VP Engineering and New Technology<br>
Paradigma Software, Inc<br>
<br>
Valentina - Joining Worlds of Information<br>
<a href="http://www.paradigmasoft.com" target="_blank">http://www.paradigmasoft.com</a><br>
<br>
[I feel the need: the need for speed]<br>
</font></span><div class="HOEnZb"><div class="h5"><br>
<br>
_______________________________________________<br>
Valentina-studio mailing list<br>
<a href="mailto:Valentina-studio@lists.macserve.net">Valentina-studio@lists.macserve.net</a><br>
<a href="http://lists.macserve.net/mailman/listinfo/valentina-studio" target="_blank">http://lists.macserve.net/mailman/listinfo/valentina-studio</a><br>
<br>
</div></div></blockquote></div><br></div>