Sybase NNTP forums - End Of Life (EOL)

The NNTP forums from Sybase - forums.sybase.com - are now closed.

All new questions should be directed to the appropriate forum at the SAP Community Network (SCN).

Individual products have links to the respective forums on SCN, or you can go to SCN and search for your product in the search box (upper right corner) to find your specific developer center.

Out of memory problem!

4 posts in Objects. Last posting was on 2008-03-19 08:23:52.0Z
neil Posted on 2008-03-13 09:20:34.0Z
Sender: 5064.47d8ee0a.1804289383@sybase.com
From: Neil
Newsgroups: sybase.public.powerbuilder.objects
Subject: Out of memory problem!
X-Mailer: WebNews to Mail Gateway v1.1t
Message-ID: <47d8f1e1.50e4.1681692777@sybase.com>
NNTP-Posting-Host: 10.22.241.41
X-Original-NNTP-Posting-Host: 10.22.241.41
Date: 13 Mar 2008 01:20:34 -0800
X-Trace: forums-1-dub 1205400034 10.22.241.41 (13 Mar 2008 01:20:34 -0800)
X-Original-Trace: 13 Mar 2008 01:20:34 -0800, 10.22.241.41
Lines: 29
Path: forums-1-dub!not-for-mail
Xref: forums-1-dub sybase.public.powerbuilder.objects:9633
Article PK: 736774

Hi,

I have an out-of-memory problem. Here is what my program
does:

I have around 1000 files. These files are date-aligned,
meaning each file covers the same dates (the same number
of days) as the others. I process them one by one in a
loop. Inside the loop, I import each file into a datastore
(ds1). That datastore has some calculations (computed
fields), so after importing I call GroupCalc() and then
apply the computed results to another datastore (ds2) that
serves as a master. The master datastore has no duplicate
dates; the value for each date in ds2 is just adjusted
(added to or subtracted from) by the value for the
corresponding date in ds1. After that, I reset ds1 and
repeat the loop.

I am wondering why my application eventually runs out of
memory. I do reset ds1 on each iteration. Does that cause
memory to accumulate, or is there a memory leak here?

Any ideas or opinions are most appreciated.

Note: I am using PB10. ds1 and ds2 are external datastores.

Thanks,
Neil
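
[Editor's note: a minimal PowerScript sketch of the loop described above. The DataWindow object names and file-name pattern are hypothetical, and the merge into ds2 is elided.]

```powerscript
// ds1 holds one imported file at a time; ds2 is the running master.
datastore ds1, ds2
long ll_file, ll_rc

ds1 = CREATE datastore
ds1.DataObject = "d_ext_file"    // hypothetical external DataWindow object
ds2 = CREATE datastore
ds2.DataObject = "d_ext_master"  // hypothetical; one row per date

FOR ll_file = 1 TO 1000
    // file-name pattern is an assumption for illustration
    ll_rc = ds1.ImportFile("file" + String(ll_file) + ".txt")
    IF ll_rc < 0 THEN EXIT       // import failed

    ds1.GroupCalc()              // recompute the computed fields

    // ... add/subtract each date's computed value into the
    //     matching row of ds2 (merge logic elided) ...

    ds1.Reset()                  // clear ds1's rows for the next file
NEXT

DESTROY ds1
DESTROY ds2
```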


"John Olson [Team Sybase]" <john.olson Posted on 2008-03-13 23:11:13.0Z
From: "John Olson [Team Sybase]" <john.olson@nospam_teamsybase.com>
Newsgroups: sybase.public.powerbuilder.objects
References: <47d8f1e1.50e4.1681692777@sybase.com>
Subject: Re: Out of memory problem!
Lines: 46
X-Priority: 3
X-MSMail-Priority: Normal
X-Newsreader: Microsoft Outlook Express 6.00.2900.3138
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2900.3138
X-RFC2646: Format=Flowed; Original
NNTP-Posting-Host: vip152.sybase.com
X-Original-NNTP-Posting-Host: vip152.sybase.com
Message-ID: <47d9b491$1@forums-1-dub>
Date: 13 Mar 2008 15:11:13 -0800
X-Trace: forums-1-dub 1205449873 10.22.241.152 (13 Mar 2008 15:11:13 -0800)
X-Original-Trace: 13 Mar 2008 15:11:13 -0800, vip152.sybase.com
X-Authenticated-User: TeamSybase
Path: forums-1-dub!not-for-mail
Xref: forums-1-dub sybase.public.powerbuilder.objects:9634
Article PK: 736777

It sounds like a leak. Try destroying ds1 and recreating it every x files
and see if that resolves your problem. On the other hand, it could be
one of the destination datastores that is blowing up. Check the row count
occasionally to see whether you are somehow adding far more rows than you
think you are putting in the datastores. If it is a huge number of rows and
you are exceeding memory, you can try using the SavetoDisk option, which
will cache the rows on disk.

Regards,
John
Team Sybase
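
[Editor's note: a sketch of both suggestions, placed inside the poster's loop. The interval of 100 files, the row threshold, and the DataWindow object name are arbitrary illustrations.]

```powerscript
// Periodically destroy and recreate ds1 in case it is leaking.
IF Mod(ll_file, 100) = 0 THEN
    DESTROY ds1
    ds1 = CREATE datastore
    ds1.DataObject = "d_ext_file"   // hypothetical DataWindow object name
END IF

// Sanity-check the destination: with date-aligned files, ds2 should
// stay at roughly one row per date, not grow with every file.
IF ds2.RowCount() > 10000 THEN
    MessageBox("Check", "ds2 has " + String(ds2.RowCount()) + " rows")
END IF
```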

<Neil> wrote in message news:47d8f1e1.50e4.1681692777@sybase.com...
> [original message quoted in full; snipped]


Arthur Hefti Posted on 2008-03-14 04:14:15.0Z
From: "Arthur Hefti" <arthur@catsoft.ch>
Newsgroups: sybase.public.powerbuilder.objects
References: <47d8f1e1.50e4.1681692777@sybase.com>
Subject: Re: Out of memory problem!
Lines: 39
X-Priority: 3
X-MSMail-Priority: Normal
X-Newsreader: Microsoft Outlook Express 6.00.2900.3138
X-RFC2646: Format=Flowed; Original
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2900.3198
NNTP-Posting-Host: vip152.sybase.com
X-Original-NNTP-Posting-Host: vip152.sybase.com
Message-ID: <47d9fb97$1@forums-1-dub>
Date: 13 Mar 2008 20:14:15 -0800
X-Trace: forums-1-dub 1205468055 10.22.241.152 (13 Mar 2008 20:14:15 -0800)
X-Original-Trace: 13 Mar 2008 20:14:15 -0800, vip152.sybase.com
X-Authenticated-User: panorama
Path: forums-1-dub!not-for-mail
Xref: forums-1-dub sybase.public.powerbuilder.objects:9635
Article PK: 736779

Try calling GarbageCollect() about every 10th file. AFAIK, the automatic
garbage collection isn't done until the loop has finished.

HTH
Arthur
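
[Editor's note: inside the loop this amounts to one extra line; the interval of 10 is Arthur's suggestion.]

```powerscript
// Force a collection pass every 10th file instead of waiting
// for the loop to finish.
IF Mod(ll_file, 10) = 0 THEN GarbageCollect()
```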

<Neil> wrote in message news:47d8f1e1.50e4.1681692777@sybase.com...
> [original message quoted in full; snipped]


neil Posted on 2008-03-19 08:23:52.0Z
Sender: 7016.47df1de7.1804289383@sybase.com
From: Neil
Newsgroups: sybase.public.powerbuilder.objects
Subject: Re: Out of memory problem!
X-Mailer: WebNews to Mail Gateway v1.1t
Message-ID: <47e0cd98.16e6.1681692777@sybase.com>
References: <47d9fb97$1@forums-1-dub>
NNTP-Posting-Host: 10.22.241.41
X-Original-NNTP-Posting-Host: 10.22.241.41
Date: 19 Mar 2008 00:23:52 -0800
X-Trace: forums-1-dub 1205915032 10.22.241.41 (19 Mar 2008 00:23:52 -0800)
X-Original-Trace: 19 Mar 2008 00:23:52 -0800, 10.22.241.41
Lines: 45
Path: forums-1-dub!not-for-mail
Xref: forums-1-dub sybase.public.powerbuilder.objects:9636
Article PK: 736781

Thanks for the replies. It seems to be improving now with
GarbageCollect.

Many Thanks.

> Try to call GarbageCollect ( ) about every 10th file. AFAK
> the automatic garbage collection isn't done until the
> loop has finished.
>
> HTH
> Arthur
>
> [original message quoted in full; snipped]