That sounds quite impressive. Can you move common fragments of code into user-defined statements (e.g., into the Script Library)? Parsing is usually a good candidate for that.

Don't you want to break this "big" job into multiple pieces just to create "restarting points"? That way you get more flexibility: if something goes wrong, you can restart the processing from the last point of failure instead of being forced to rerun the entire thing.

The main purpose of multiple job queues in 24x7 is to support multiple concurrent job streams, in other words, to place jobs in a queue in the order in which they must be executed. The same result can be achieved using job dependencies. Multiple "little" jobs can run concurrently, doing FTP, email, and file copy at the same time, while one "big" job always runs all steps sequentially, one after another.

Have you seen my comments about the @SCRIPT tag? That is how you can bypass the 32K limit. You can still run and debug jobs the old way; the only difference is that the script is stored outside of the job database.

By the way, thank you for your nice comments about 24x7.

: Thanks. I can live with the typecasting problem.
:
: I beg to differ with you on the issue of "large scripts".
: At this very moment I am working on a script that
: replaces a mainframe script which produces a wide
: variety of reports based on about 12 parameters.
: The script must parse these parameters for correctness;
: notify operators/users of errors; select data for the
: reports by either running SQL scripts or running
: a wide variety of pre-built programs that select data
: in a format used by all the reporting programs; loop in
: the data selection process to join together data from
: separate sources (e.g. SQL and a program) as well as
: retrieving special input files via file copy or FTP
: from a variety of sources; run filter
: programs against the selected data to enforce
: various predefined business rules or security rules;
: run the output report; distribute the output report via
: printing, emailing, transfer to COLD, FTP, etc.,
: to a list of recipients determined by the runtime
: parameters; create a special tab-delimited version
: of selected fields in the data specified by the
: parameters that reference these special output
: formats in the database; and finally, create
: backup copies of all the files that became
: involved in the process.
:
: In the mainframe environment where this script grew
: up, it saved the Operations department hundreds
: of hours per year, as almost every run could be
: automated. Users even submitted parameter lists
: to the job to run via a web site. Scripting at
: this level of generic complexity provided consistent
: reports to users and repeatable schedulability not
: possible in a manual environment.
:
: Yes, this task could be written in a
: "genuine" programming language; however, it is ideally
: suited to a scripting language, as the tasks
: involved are moving files, running programs, notifying
: people, and logging what was done.
:
: The script described above will probably fit in the
: 32K limit; however, I will probably have to curtail
: comments a bit, as they seem to have something to
: do with the limit.
:
: In other cases, such as our main nightly processing
: script which runs about 150 reports and distributes
: them, we took the approach of breaking the
: job up into about 5 jobs. This is acceptable; however,
: it added overhead, as parameters had to be maintained
: between the pieces of the jobs.
:
: In complex environments with lots of requirements,
: complex scripting is often the most cost-effective
: answer. 24x7 has proved very capable overall; I just
: wish the spectre of the 32K limit didn't hamper
: the ability to design real-world cost-saving automation
: solutions.
:
: Thanks for listening, and please keep up the good work
: with 24x7.
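The "restarting points" idea suggested in the reply can be sketched generically. This is illustrative Python, not 24x7 Scheduler job script syntax; the stage names, the checkpoint file name, and the `run_pipeline`/`mark_done` helpers are all assumptions made for the example. Each stage records its completion in a small checkpoint file, so a failed run can be resumed from the first unfinished stage instead of rerunning the whole job:

```python
# Sketch of per-stage restart points for a multi-step batch job.
# Everything here (file name, stage names) is hypothetical.
import json
from pathlib import Path

CHECKPOINT = Path("nightly_job.checkpoint.json")

def load_done() -> set:
    """Return the set of stage names already completed in a prior run."""
    if CHECKPOINT.exists():
        return set(json.loads(CHECKPOINT.read_text()))
    return set()

def mark_done(stage: str, done: set) -> None:
    """Record a stage as finished so a restart can skip it."""
    done.add(stage)
    CHECKPOINT.write_text(json.dumps(sorted(done)))

def run_pipeline(stages) -> list:
    """Run (name, callable) stages in order, skipping finished ones.

    Returns the names of the stages actually executed this run.
    """
    done = load_done()
    ran = []
    for name, func in stages:
        if name in done:
            continue  # completed in an earlier run; skip on restart
        func()       # if this raises, the checkpoint still holds prior stages
        mark_done(name, done)
        ran.append(name)
    return ran
```

On a clean run every stage executes; after a mid-run failure, rerunning the same pipeline skips the stages already marked done, and deleting the checkpoint file forces a full rerun. The same checkpoint file could also carry the parameters that must be handed between job pieces, which addresses the overhead mentioned in the quoted post.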