SoftTree Technologies Technical Support Forums
SoftTree Technologies Forum Index » 24x7 Scheduler, Event Server, Automation Suite

sorry to trouble u again
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: sorry to trouble u again

I've noticed that if I use JAL to read a .txt file, I cannot open the same file to view its contents while JAL is reading from it.

(On the other hand, if I use JAL to write to a .txt file, I can actually open the same file and see its contents while JAL is writing to it.)

Is there any way to read a text file without the exclusive lock?
Mon Feb 20, 2012 5:15 am
SysOp
Site Admin


Joined: 26 Nov 2006
Posts: 7838

Please provide your script for reading files. There is something wrong with it.
Mon Feb 20, 2012 9:50 am
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: problems i've run into

Actually, I've run into an issue* where the script complains that it cannot open a file. We checked which line failed to open the file, and saw that it was scp2as400.log.
Because the schedules are in a queue, nothing else should be accessing the file.
The only other process we can think of that logs to that file is the scp batch file.

Also, while the schedules are running, we cannot open daily-hhmm.txt with Notepad to see its contents.

Generally, I get the impression that when JAL needs to read a file, it has to lock it, although in the issue I first described* our script is actually trying to write to the file.

I'm in the process of adding OnErrorGoTo to these scripts so that script errors send them into the retries.

I also plan to make the scp batch file log to something else, so that I don't have two programs writing in turn to the same log file, and hopefully avoid the locked log file issue.

* Sorry, I've been using the replace function on the names of things.


Last edited by liangzhu on Tue Feb 21, 2012 1:19 am; edited 1 time in total
Mon Feb 20, 2012 11:39 pm
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: this too says it cannot open file

In fact, with this script, a job that runs at 6 AM, whose input file is dailyrpt-0600.txt, has the same problem: the text file (timestamp-reports.txt) cannot be opened (at various points, if I rerun the job).
The jobs that run this script are in a separate queue from the jobs in the previous post.
The source file lists are different, the log files are different, and the retry list files are different,
but the problem is vaguely similar:
it says it cannot open the file, and the file is apparently one it had already opened and closed moments earlier. It does not fail at any one particular line; it seems to happen randomly.
Again, we wonder if it is because the scp batch file was just writing to it?


Last edited by liangzhu on Tue Feb 21, 2012 1:18 am; edited 1 time in total
Mon Feb 20, 2012 11:57 pm
SysOp
Site Admin


Joined: 26 Nov 2006
Posts: 7838

Wow, you've got quite a long and sophisticated script. Does the error message say which file cannot be opened?

I see that you are opening files for writing. In "write" mode, files are locked to other processes until the operation is complete and the file is closed.

Since you are writing to text files, mostly a few lines at a time, you can use the following trick to speed up the process and minimize file locking.

Prepare text to write before opening the file, for example,
Code:
Dim someText, string
ConcatEx( "line1", "\r\n", "line2", "\r\n", "line3", "\r\n", someText )
FileOpen( sfilelist, "streammode", "write", true, nFile )
FileWrite( nFile, someText )
FileClose( nFile )


As you can see, all the text manipulation is done before anything goes to the file, and the file write is done in one go.

Hope that helps
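For comparison only, here is the same buffer-then-write-once pattern sketched in Python (not JAL; the file name is hypothetical):

```python
import os
import tempfile

def append_log(path, lines):
    """Join all lines up front, then open, write once, and close,
    so the file stays locked for the shortest possible time."""
    # all text manipulation happens before the file is opened
    text = "\r\n".join(lines) + "\r\n"
    with open(path, "a", newline="") as f:  # file held open only for this block
        f.write(text)  # a single write, then the file closes immediately

# hypothetical example file
log_path = os.path.join(tempfile.gettempdir(), "example-reports.txt")
open(log_path, "w").close()  # start with an empty file
append_log(log_path, ["line1", "line2", "line3"])
```

The key point is the same as in the JAL snippet above: string concatenation is cheap, so do all of it before opening the file, and keep the open-write-close window as narrow as possible.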
Tue Feb 21, 2012 12:30 am
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: actually i'm mixing up issues, sorry

Code:
fileopen( sCurrFilelist, "LineMode", "Read", false, nFileCount)
// skip the first 2 lines
// 01/09/2012
// store the whoever path defined in the first line
fileread (nFileCount, stemp)
Replace (stemp, 1, 3, "", stemp)
set scppaths, stemp
fileread (nFileCount, stemp)


Already at this point, I cannot open the text file (sCurrFilelist) with Notepad while the script is running.
Tue Feb 21, 2012 1:17 am
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: separate issue is this one

This is maybe a separate issue, which is that there could be a second, separate process logging to the same file the JAL script is writing to.

In particular, I produce the batch file this way:

Code:
today(adate)
format adate, "mmddyyyy", sdate
now(atime)
concat ("E:\\SFTP\\whoever\\SCRIPT\\Get_files\\wherever\\Log\\", sdate, slog)
concat (slog, "reports.txt", slog)
fileexists slog , blog
ifthen blog, goloop
filesave slog, ""


.
.
.

Code:
concat scpcmd, " >> ", scpcmd
concat scpcmd, slog, scpcmd
concat scpcmd, " 2>&1", scpcmd
filesave ("E:\\SFTP\\whoever\\SCRIPT\\Get_files\\wherever\\scp.bat", scpcmd)
runandwait ("cmd /C E:\\SFTP\\whoever\\SCRIPT\\Get_files\\wherever\\scp.bat", "", "0", nscpid)
set scpcmd, ""


The batch file logs to "slog".

Then I make "slog" more readable by doing things like:

Code:
sourceok:
fileopen (slog, "linemode", "write", true, nlog)
filewrite nlog, ""
now (atime)
filewrite nlog, atime
filewrite nlog, "successfully downloaded"
filewrite nlog, sdistribute
filewrite nlog, ""
fileclose nlog


And there was a job that complained that it could not open slog.
The job loops over and over: first it runs the batch file to download, then it does other things, and all along the way it updates slog with the time and what it is doing.
When it died, I reran the job, and then it stopped at some other line where it could not open a file, which was again slog.
Tue Feb 21, 2012 1:48 am
SysOp
Site Admin


Joined: 26 Nov 2006
Posts: 7838

A couple of additional suggestions. Please verify that all these jobs run detached: if a job fails in the middle of file processing, in detached mode the system will try to close all open files automatically on job exit. Also consider developing a user-defined statement, say myFileWrite, taking two parameters (the log file name and the text to write) and doing something like:


Code:
Dim try_count, number, 3
Dim nlog, number
Dim error, string

WRITE:
// try file open and write
OnErrorGoTo TRY_AGAIN
Subtract(try_count, 1)
FileOpen( file_name, "linemode", "write", true, nlog)
filewrite nlog, text
fileclose nlog

// completed, exit statement
OnErrorStop
Exit

TRY_AGAIN:
GetLastError error
OnErrorStop
// wait 3 seconds and try again
Wait( 3)
IfThen( try_count, WRITE )
// if we are here, all 3 attempts failed, raise error
RaiseError error

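For illustration, the same open-write-close-with-retries idea sketched in Python (not JAL; the function name mirrors the hypothetical myFileWrite above):

```python
import time

def my_file_write(file_name, text, tries=3, wait_seconds=3):
    """Try to open, write, and close; on failure wait and retry,
    and re-raise the last error once all attempts are used up."""
    last_error = None
    for _ in range(tries):
        try:
            with open(file_name, "a") as f:  # open, write, close in one shot
                f.write(text)
            return  # success, exit
        except OSError as e:  # e.g. the file is locked by another process
            last_error = e
            time.sleep(wait_seconds)
    raise last_error  # all attempts failed, raise the last error seen
```

The design point is the same as the JAL version: a transient lock from another writer only costs a short wait, while a persistent failure still surfaces as an error after the final attempt.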
Tue Feb 21, 2012 9:37 am
liangzhu



Joined: 30 Nov 2011
Posts: 42
Country: Singapore

Post: does detached affect the queue?

At the moment I've actually tried to enforce that all the jobs are non-detached and non-asynchronous.
Every schedule passes a list of files to the script to be processed.
If a file fails, it gets written to a retry list.
Then another schedule, later, passes the retry list to the script to process the retries.
When writing retries, it produces retry list 2 if the incoming list is retry list 1.
And at the end of processing any retry list, it deletes the incoming retry list.
We have had issues with retry jobs (e.g. retry 2) not being able to delete the incoming list because the previous list was still being written by the previous retry job (i.e. retry 1). We use the existence of the retry lists as semaphore files to trigger the retry jobs.
So everything must queue. Does making a job detached affect queueing?

Also, we use batch files a lot. The batch files do SFTP and actually read the file list as SFTP commands. We use the batch file to download first, then use JAL to work through the same list to check the downloaded files.
If the previous retry job tries to write to the list while SFTP is ongoing, the SFTP can hang, and then the RunAndWait that ran the SFTP will wait forever, because we did not limit the timeout (we don't know the maximum time a download can take). And we can't seem to stop the job in this situation if it's detached, even if we restart 24x7.

So, generally: will making jobs detached affect the queueing? And if any job encounters a locked file for whatever reason, how do we kill it?

Thanks.
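As a sketch of the retry-list flow described above (in Python with hypothetical file names and a hypothetical per-file handler; the real jobs are JAL and batch files):

```python
import os

def process_list(list_path, next_list_path, process_one):
    """Process each file named in list_path. Failures go into
    next_list_path, whose existence acts as the semaphore that
    triggers the next retry job. The incoming list is deleted
    only once it has been fully processed."""
    failures = []
    with open(list_path) as f:
        for name in (line.strip() for line in f):
            if name and not process_one(name):
                failures.append(name)
    if failures:
        # create the next retry list: its existence triggers the retry job
        with open(next_list_path, "w") as f:
            f.write("\n".join(failures) + "\n")
    os.remove(list_path)  # done: the semaphore for this stage disappears
    return failures
```

In this scheme the delete at the end is the handoff signal, which is why a lingering writer on the previous list breaks the chain.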
Tue Feb 21, 2012 8:58 pm
SysOp
Site Admin


Joined: 26 Nov 2006
Posts: 7838

Normally all jobs should be set to run in detached mode. It has nothing to do with queuing. Detached jobs run as separate system processes, not as internal threads. That makes the system more stable: resource leaks from detached jobs don't impact the scheduler the way leaks from attached jobs do. Only special jobs that use the JobXXXX commands and similar commands for dynamically modifying the job database should run in attached mode. Hope this helps.
Wed Feb 22, 2012 1:49 am