301 Errors
Chuck Neal
chuck at mediamacros.com
Tue Oct 19 15:45:31 CDT 2004
Basically I have a relational database with multiple tables joined together.
Here is a simple example...
Box - ID, Name
Item - ID, Name
Item_box - Box_id, Item_id
The IDs are created in the order the records are added, so when I import/export I have
to be aware that the IDs will change, so...
First I go through the boxes. I create a new, empty database and get all
the current boxes from the original database. I then add each one and build a
list mapping each old ID to its new ID....
This way I know what the old and new numbers are. I do the same for item,
then I rebuild the item_box table by checking the old numbers against the
new ones and creating new entries.
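To illustrate the remapping above, here is a minimal sketch using Python's sqlite3 rather than Valentina's actual API; the table and column names follow the example schema, and the function name is hypothetical:

```python
import sqlite3

def copy_with_remap(src, dst):
    """Copy Box, Item, and Item_box from src to dst, remapping IDs.

    The destination assigns fresh IDs in insertion order, so the join
    table must be rewritten through old-ID -> new-ID maps.
    """
    box_map = {}   # old Box.ID  -> new Box.ID
    item_map = {}  # old Item.ID -> new Item.ID

    for old_id, name in src.execute("SELECT ID, Name FROM Box"):
        cur = dst.execute("INSERT INTO Box (Name) VALUES (?)", (name,))
        box_map[old_id] = cur.lastrowid

    for old_id, name in src.execute("SELECT ID, Name FROM Item"):
        cur = dst.execute("INSERT INTO Item (Name) VALUES (?)", (name,))
        item_map[old_id] = cur.lastrowid

    # Rebuild the join table by translating both foreign keys.
    for box_id, item_id in src.execute("SELECT Box_id, Item_id FROM Item_box"):
        dst.execute("INSERT INTO Item_box (Box_id, Item_id) VALUES (?, ?)",
                    (box_map[box_id], item_map[item_id]))
    dst.commit()
```

The same pattern applies regardless of engine: copy the parent tables first, record the ID translation, then rewrite the join table last.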
I am adding images and text, but all of these were added via the same
interface. It seems that it may be getting overloaded when doing this in
one loop, though. Is there any way to force the cache to empty out? I tried
flush() at key points, but it had no effect. No single item is more than a
few KB, and if I export a smaller subset of the data it works fine.
I tried raising the cache from 4 MB to 8 MB and it still errors. The same
commands that were used to create the initial data are also being used to
copy the DB. I have a parent script that acts as the DB controller, so I just
spawn a second one and send add() commands to it; nothing is built any
differently than in the base application, except that it is doing the adds
quickly in a loop.
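One general pattern for the "flush at key points" idea above is to commit in fixed-size batches inside the copy loop, so the engine never has to buffer the whole export at once. This is a sketch of the technique only, again in sqlite3 terms; Valentina's actual flush/commit calls and its cache behavior may differ:

```python
import sqlite3

def copy_in_batches(src, dst, batch_size=500):
    """Copy Item rows from src to dst, committing after each batch.

    Committing periodically bounds the amount of uncommitted work the
    engine must hold in its cache during a long copy loop.
    """
    pending = 0
    for (name,) in src.execute("SELECT Name FROM Item"):
        dst.execute("INSERT INTO Item (Name) VALUES (?)", (name,))
        pending += 1
        if pending >= batch_size:
            dst.commit()   # flush this batch before continuing
            pending = 0
    dst.commit()           # flush the final partial batch
```

If the errors stop once the loop commits in batches, that points at cache pressure; if they persist at the same record regardless of batch size, the trigger is more likely a specific record or command.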
Any other ideas? These are tests, and clients may have much larger data sets
they want to export, so I can't count on a small size being enough to
get around the problem. Why does it always corrupt at the same point? It's
all plain text and image data, and it comes from existing records, so it
should be formatted correctly.
-Chuck
--------------------------
Chuck Neal
CEO, MediaMacros, Inc.
chuck at mediamacros.com
http://www.mediamacros.com
--------------------------
Check out the Developers Mall
Your one stop shop for all your Director Xtra Needs
http://www.mediamacros.net/customer
-----Original Message-----
From: valentina-bounces at lists.macserve.net
[mailto:valentina-bounces at lists.macserve.net] On Behalf Of Ruslan Zasukhin
Sent: Tuesday, October 19, 2004 1:54 PM
To: valentina at lists.macserve.net
Subject: Re: 301 Errors
On 10/19/04 8:12 PM, "Chuck Neal" <chuck at mediamacros.com> wrote:
> Well, the problem is that this is a new database. I create the new
> database and add the paired data to it by reading records, getting
> the info via a list, then writing them to the new database. This
> works fine for smaller chunks of data, but the larger ones seem to
> corrupt it every single time. The original database is not affected;
> it's just the new one that gives me problems. What causes this
> in the first place? Is there a command or common event that makes
> this happen that I should avoid?
There is no such info.
301 -- this is something very deep in the engine.
> I was able to export a small group just now with no problems, yet
> others fail every single time with a 301 error. I can't always
> predict the size or content of the data the user will have, so how
> can I safeguard against this happening at runtime?
So you read the records in a loop and produce the export file yourself?
If you comment out the export and leave only the iteration over the records,
is the problem still there?
What size is your cache?
Make sure it is big enough.
--
Best regards,
Ruslan Zasukhin [ I feel the need...the need for speed ]
-------------------------------------------------------------
e-mail: ruslan at paradigmasoft.com
web: http://www.paradigmasoft.com
To subscribe to the Valentina mail list go to:
http://lists.macserve.net/mailman/listinfo/valentina
-------------------------------------------------------------
_______________________________________________
Valentina mailing list
Valentina at lists.macserve.net
http://lists.macserve.net/mailman/listinfo/valentina