
Memory leak with getRows() #1194

Open
lkiii opened this issue Jan 7, 2020 · 62 comments

@lkiii

lkiii commented Jan 7, 2020

  1. Is it an error or a hang or a crash?
    Out of memory. RAM usage.
  2. What error(s) are you seeing?
    This output is visible after ~20 HTTP REST API calls to the server; each request accesses the database via oracledb.
rss: " 1158.07 MB"
heapTotal: " 65.07 MB"
heapUsed: " 42.65 MB"
external: " 1.34 MB"
  3. Include a runnable Node.js script that shows the problem.
    I am closing all the connections and triggering the garbage collector manually.
    oracledb is used to call stored procedures and functions with object binds. I am using connection.getDbObjectClass() for the binds and I suspect this could be the cause of the memory issues. It seems that all the issues started after upgrading to this approach.
  4. Run node and show the output of:
> process.platform
'linux'
> process.version
'v12.14.0'
> process.arch
'x64'
> require('oracledb').versionString
'4.1.0'
> require('oracledb').oracleClientVersionString
'18.3.0.0.0'
  5. What is your Oracle Database version?
    12c

Here is the backstory of how I came to the conclusion that this is an oracledb-related issue: nodejs/help#2393

@lkiii added the bug label on Jan 7, 2020
@anthony-tuininga
Member

We're still going to need a runnable script that demonstrates the problem; otherwise, we are just guessing! There is a known issue with connection.getDbObjectClass() which has not yet been fixed in Node.js (see here) but there might be something else that requires further attention. Please provide a runnable script. Thanks!

@lkiii
Author

lkiii commented Jan 7, 2020

I will try to replicate the issue with a small example. It is a part of a custom framework and I need some time to try isolating the issue.

Is there any known workaround on how to force memory release?

@dmcghan

dmcghan commented Jan 7, 2020

Also, look through this issue: #1186

In the last message from @zymon, he provides a link to an Excel document with memory usage analysis from before and after he implemented the changes I suggested. Before the changes, there was an unreasonable amount of RSS memory consumed.

The changes mostly had to do with the way the JavaScript code worked with connections from the database. For example, don't use Promise.all with database connections. Do you know if you're using Promise.all with database connections?
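
To illustrate the point (a minimal sketch, not from this project, assuming a connection pool already exists): if work really must run concurrently, each task should take its own connection from the pool rather than sharing one connection through Promise.all.

    // Sketch only: pool, sqlA and sqlB are placeholders.
    async function runOne(pool, sql, binds) {
      const conn = await pool.getConnection();
      try {
        return await conn.execute(sql, binds);
      } finally {
        await conn.close();   // return the connection to the pool
      }
    }

    // OK: the concurrent tasks do not share a connection.
    // const [a, b] = await Promise.all([runOne(pool, sqlA, {}), runOne(pool, sqlB, {})]);

    // Avoid: two statements issued "in parallel" on the SAME connection.
    // const conn = await pool.getConnection();
    // await Promise.all([conn.execute(sqlA), conn.execute(sqlB)]);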

@lkiii
Author

lkiii commented Jan 7, 2020

No, there is no Promise.all anywhere in the project. I will try to get back with the code example tomorrow morning.

One HTTP request equals one database call in most cases.
Here is a simple representation of the application flow:
HTTP request -> oracle connect -> getDbObjectClass -> prepare binds -> execute -> get results from cursor (getRows(); the cursor is closed after reading the data) -> HTTP response -> oracle connection close
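
For reference, a minimal sketch of that flow in node-oracledb (the type and package names MYPKG.MY_TYPE and mypkg.get_rows are placeholders, not the project's real names):

    const oracledb = require('oracledb');

    async function handleRequest(dbConfig) {
      const conn = await oracledb.getConnection(dbConfig);           // oracle connect
      try {
        const MyType = await conn.getDbObjectClass('MYPKG.MY_TYPE'); // getDbObjectClass
        const result = await conn.execute(                           // prepare binds + execute
          'BEGIN mypkg.get_rows(:input, :cursor); END;',
          {
            input: { type: MyType, val: { ID: 1 } },
            cursor: { type: oracledb.CURSOR, dir: oracledb.BIND_OUT }
          }
        );
        const rs = result.outBinds.cursor;
        const rows = await rs.getRows(1000);                         // read the cursor
        await rs.close();                                            // close the cursor
        return rows;                                                 // -> HTTP response
      } finally {
        await conn.close();                                          // oracle connection close
      }
    }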

@dmcghan

dmcghan commented Jan 7, 2020

That was just a lucky guess. :) There are lots of other ways this could happen.

Could you please share some code with us? Maybe just edit any sensitive parts out?

@lkiii
Author

lkiii commented Jan 8, 2020

@dmcghan @anthony-tuininga
I have already prepared a test case: app.zip

To start the application you need to:

  1. cp config\config.example.json config\config.json and change config.json values for database:
   "db": {
        "user": "aaa",
        "password": "aaa",
        "connectString": "192.168.1.1/SID",
        .....
  2. prepare a database with the types and package from database.sql
  3. npm install
  4. npm run build
  5. npm run start
  6. Send http get requests to:
localhost:8081/v1/number-table
localhost:8081/v1/string-table
localhost:8081/v1/object-table

You will see memory usage statistics inside each response.

@dmcghan

dmcghan commented Jan 8, 2020

The first thing that jumps out at me, if I'm reading the code correctly, is that you're creating a single connection to the database and caching that in your database module. I can see how this may be the source of the trouble.

You'll need to switch to using a connection pool. Update execute so that it gets connections from the pool, uses them, and then releases them. See the simpleExecute method in this post for an example: https://jsao.io/2018/03/creating-a-rest-api-database-basics/
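
A minimal sketch of that helper (assuming a default pool has already been created with oracledb.createPool() at startup):

    const oracledb = require('oracledb');

    async function simpleExecute(sql, binds = {}, opts = {}) {
      let conn;
      try {
        conn = await oracledb.getConnection();   // draws from the default pool
        return await conn.execute(sql, binds, opts);
      } finally {
        if (conn) {
          await conn.close();                    // releases the connection back to the pool
        }
      }
    }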

Get rid of methods like commit and rollback, which don't add anything - just use the methods on the connection.

If possible, get rid of setTimeDateLocaleParameters in favor of using the environment variables ORA_SDTZ and NLS_DATE_FORMAT. See the doc here: https://oracle.github.io/node-oracledb/doc/api.html#-15163-fetching-dates-and-timestamps

This will be more important as you move to a connection pool as you don't want to incur two round trips (one per execute) to set these values. There are other options if you can't use environment variables for some reason.

Make those changes and test again. Let us know how it goes.

@lkiii
Author

lkiii commented Jan 8, 2020

@dmcghan Thank you for the response. The connection pool is already on our to-do list. I totally agree that it would be more efficient, but I think the issue is with connection.getDbObjectClass(), because even a single request to a freshly started server has huge RSS usage.

Also, the current issue only appeared after upgrading to the object bind approach. Before that, the same application used oracledb v3.x and we did not have RSS memory issues.

@dmcghan

dmcghan commented Jan 8, 2020

The connection pool is already on our to-do list.

Please try to escalate this ASAP. The current implementation is all kinds of wrong.

because even a single request to a freshly started server has huge RSS usage.

Ah, that is interesting. Are you calling this for a lot of classes?

Now that you have a test case, you should be able to test this, no? Find out which classes you're calling the method for (hopefully there are just a few) and get the values it's returning.

Then update the function so that it doesn't actually call the method - just return the values based on the string passed in.

@lkiii
Author

lkiii commented Jan 8, 2020

@dmcghan I have recreated the same case without any binds except the cursor output, and the leak is still there. It is barely visible, but still there. It seems you may be right that the leak originates from creating a new database connection on each request and then closing it after the request. This seems to be another issue, but the connection pool might solve it.

I will upgrade connection handling and let you know about the results.

@dmcghan

dmcghan commented Jan 8, 2020

I wouldn't expect to see the RSS issues when creating one-off connections for each request. Though I would expect performance to suck. Use a pool! :)

I'm still new to your code, and I'm not a TS person, but from what I see, generalDBGetHandler creates a new instance of CallParameters. CallParameters uses the OracleDBHelper class, but doesn't create a new instance.

The only method that creates a new instance of OracleDBHelper is initiateDatabase, which I think happens only once. Nothing else calls oracledb.getConnection, so I'm assuming you're sharing a single connection across multiple requests. That's what has me cringing. Again, creating new connections per request is bad for various reasons, but sharing a single connection amongst many requests is also bad, just for different reasons.

Update your code to use a pool and let us know what happens.

@lkiii
Author

lkiii commented Jan 8, 2020

I'm still new to your code, and I'm not a TS person, but from what I see, generalDBGetHandler creates a new instance of CallParameters. CallParameters uses the OracleDBHelper class, but doesn't create a new instance.

OracleDBHelper is being created on each http request, which means a new connection is created each time. After the http response is sent, the connection is closed. So this still should not have any memory issues.

I wouldn't expect to see the RSS issues when creating one-off connections for each request. Though I would expect performance to suck. Use a pool! :)

The pool's priority went up after your comments and it will be implemented soon. But there is still a memory leak somewhere with single-instance creation, even if the pool resolves the issues.

@dmcghan

dmcghan commented Jan 8, 2020

OracleDBHelper is being created on each http request

I believe you, but please point out the files and relevant lines.

@lkiii
Author

lkiii commented Jan 8, 2020

OracleDBHelper is being created on each http request

I believe you, but please point out the files and relevant lines.

Create connection: App.ts:80
await req.database.initialize() creates a new connection before each request handler and sets the value on the http request object. Then, in the request handler, the same connection is used as part of OracleDBHelper to execute, getDbObjectClass, etc.

Close connection: App.ts:124; this happens after each request, both on success and error.

@dmcghan

dmcghan commented Jan 8, 2020

I see, so App.ts line 69 is being invoked per request. The name initiateDatabase is misleading.

But this does mean that you're creating a connection for each http request and then invoking execute twice just to initialize some session settings (ORA_SDTZ and NLS_DATE_FORMAT). While this is bad, I don't see how it would be responsible for the RSS issues.

Please update the code and let us know how things go afterward.

@cjbj
Member

cjbj commented Jan 8, 2020

It's worth testing Node.js 13, since the potential fix @anthony-tuininga pointed to only just landed in a Node.js 12 code line last night: nodejs/node#30109

@lkiii
Author

lkiii commented Jan 9, 2020

@dmcghan @cjbj I have implemented a connection pool and RSS memory usage decreased by around 30%, but this is still way above what is expected. Is there anything I could do to help find a solution for this issue?

Also, I have tried updating Node to v12.14.1 and this had zero impact on the issue.

The issue is clearly related to getDbObjectClass(), because I have run a test where I converted SQL + binds to plain SQL (i.e. ("select * from a where a = :a", {a:'a1'}) to ("select * from a where a = 'a1'", {})) without binds and the issue was still there.
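
In other words (a sketch with a placeholder table name):

    // With a bind variable:
    await connection.execute('select * from t where a = :a', { a: 'a1' });
    // With the value in-lined as a literal (no binds):
    await connection.execute("select * from t where a = 'a1'", {});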

I am building a new docker image with node V13.x and I will let you know about the results.

@cjbj
Member

cjbj commented Jan 9, 2020

@lkiii after you've tried Node 13, could you give us a test case (not an app; e.g. a single JavaScript file) and the SQL to create the tables + data? Also run some tests and plot the results to show whether this is a leak over time. Thanks!

@lkiii
Author

lkiii commented Jan 10, 2020

Node v13 had no impact on the RAM issue.

@cjbj Hello, I have provided a simple test case: oracletestcase.zip

Here is a graph of memory usage:
[memory usage graph]

@cjbj
Member

cjbj commented Jan 10, 2020

@lkiii thanks for that work - but can you remove all the other modules to eliminate them as causes? You'll have to monitor the memory externally.

@lkiii
Author

lkiii commented Jan 13, 2020

@cjbj @dmcghan Hello,
Here is the latest minimal example with the memory issue.

Memory usage:
It seems that there is a cap on the memory used, but if there is more than one object the memory stacks up and does not get released, using massive amounts of memory.

[memory usage graph]

Memory: 0.65, 37.67, 45.35, 48.24, 54.1, 60.55, 65.32, 67.31, 73.12, 54.42, 62.58, 67.16, 71.38, 73.35, 79.59, 85.31, 90.34, 98.4, 105, 113.2, 113.1, 113.2, 113.2, 113.2, 113.6, 113.5, 114.2, 114.2, 114.2, 114.3

@j-antonio-rivera

Hi @lkiii ,

Thanks for the info and the simplified script! I have tried your latest script on Oracle DB 19.5, Instant Client 19.5, and Node 12.13; it ran for about 10 hrs. I'm seeing some bumps in the memory but it stabilizes. I will try some more combinations to keep trying to replicate this. I'll keep you posted.

[memory usage chart]

@lkiii
Author

lkiii commented Jan 14, 2020

@j-antonio-rivera I am getting similar results when running the same script for a long time. Memory goes up and down, but the maximum value keeps getting bigger by several megabytes. As I wrote before, there seems to be some kind of memory threshold after which usage does not increase so drastically, but the issue is that the initial calls use some memory and it does not seem to be released. I think the same thing happens with each different object, so the used memory stacks up if different objects are being called. I will try to replicate heavier usage with more different objects from the database.

Also, you could try increasing const OBJECT_SAMPLE_SIZE = 1000000; this should give you a more visible increase.

Database version used for tests: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production

The same behaviour is visible both in a docker container with Oracle Linux and in Windows environments.

@cjbj
Member

cjbj commented Jan 14, 2020

@lkiii then you are not getting similar results, because @j-antonio-rivera didn't see an increase :)

@lkiii
Author

lkiii commented Jan 14, 2020

@lkiii then you are not getting similar results, because @j-antonio-rivera didn't see an increase :)

@cjbj I am trying to say that you do not need to look for 10 hours. The increase is most severe in the first several hundred creations of DbObject, and that memory is not released.

As I mentioned, I will try to recreate an example with more different DbObjects.

@cjbj
Member

cjbj commented Jan 14, 2020

@lkiii sounds good! And let us know your exact Oracle client & server versions.

@lkiii
Author

lkiii commented Jan 14, 2020

Server: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Client: Tested both 18.3 and 19.5; both give similar results.

I could also provide a docker image with the environment, but the same results happen on my local Windows machine, so the issue does not seem to be related to Linux/Windows Node differences.

@lkiii
Author

lkiii commented Jan 14, 2020

I built a test script to confirm my guess regarding multiple types, and there is no difference compared to a single DbObject. I am trying to create a better test case with the issue isolated.

@lkiii
Author

lkiii commented Jan 15, 2020

Confirmation regarding the issue being in oracledb: nodejs/help#2393 (comment)

@lkiii
Author

lkiii commented Jan 31, 2020

@cjbj here is the test case docker-compose.yml:

version: "3"
services:
  db:
    image: lkii/oracledbtestcase:xe
    shm_size: '1gb'
  node:
    image: lkii/oracledbtestcase:testcase
    container_name: testcase
    links: 
      - db

To start:

docker-compose up db
# Wait until database is ready to use
docker-compose up node
# To get the source code
docker exec testcase cat index.js

The issue is that RAM usage does not return to its initial state. This example uses a small data sample.
Initial usage:
[memory usage screenshot]
After all the calls are triggered:
[memory usage screenshot]
Several minutes after all calls are done (this amount is never released):
[memory usage screenshot]

If the same thing is done with ~4000k rows and ~15 fields, then RAM usage after running 50 calls with the same script increases to ~500 MB.

This is only happening if multiple requests are being processed at the same time.

@cjbj
Member

cjbj commented Jan 31, 2020

@lkiii thanks.

The other day, @j-antonio-rivera ran some tests based on your older code. There was some growth with Node.js 12, but with Node 13 memory use was flat.

@j-antonio-rivera

Hi @lkiii ,

I have been playing around with the .getRows function. I have tried some different combinations and I'm seeing that the memory stabilizes after some time of execution. Like @cjbj mentioned, I'm seeing a constant increment in memory if I use Node 12, but I verified that this is fixed by changing the Node version to 13.7. Memory used increases according to the data I'm requesting.

These are my results:

Object/Cursor - .getRows() (Running 1 Thread)
Node 12.14.1
Fetching 101 rows
840,000 hits
[memory usage chart]

Object/Cursor - .getRows() (Running 1 Thread)
Node 13.7
Fetching 10,000 rows
31,541 hits
[memory usage chart]

Object/Cursor - .getRows() (Running 50 Threads)
Node 13.7
Fetching 10,000 rows
50,000 hits
[memory usage chart]

I will post an update with my comparison between .getRows and .getRow findings.

Regards,

@cjbj changed the title from "Memory leak in native code" to "Memory leak with getRows() in Node.js 12" on Feb 3, 2020
@lkiii
Author

lkiii commented Feb 4, 2020

@cjbj @j-antonio-rivera The issue is only clearly visible if getRows() is called in parallel. My previous example is not good for testing, because it waits until the previous getRows() is closed before starting a new database call.

Have you managed to test with my docker example?

@dmcghan

dmcghan commented Feb 5, 2020

This goes back to the first comment I made regarding Promise.all. Just as you shouldn't be using Promise.all, getRows should not be called "in parallel". I'm using double quotes because a connection can only do one thing at a time, so it will never run in parallel. If you call getRows before the previous call finishes, you're basically asking the JS engine to do for you what you could have done in your code: queue the work. It will do it, but it will not do it very efficiently.
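
A sketch of what that means in practice (rs here stands for a ResultSet from a cursor out bind):

    // Avoid: issuing two fetches on the same result set "in parallel".
    // const [b1, b2] = await Promise.all([rs.getRows(100), rs.getRows(100)]);

    // Prefer: let each call finish before starting the next one.
    const b1 = await rs.getRows(100);
    const b2 = await rs.getRows(100);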

@cjbj
Member

cjbj commented Feb 5, 2020

@lkiii did you try with Node.js 13? I'd like to see what Node.js 12.15 does when it is released. It has the (or "a"?) N-API leak fix.

And follow what @dmcghan says. Also see the user doc Parallelism on a Connection.

@cjbj
Member

cjbj commented Feb 6, 2020

@lkiii Node.js 12.15 is out. Give it a whirl and let us know how it behaves.

@lkiii
Author

lkiii commented Feb 6, 2020

For now, while (getRow()) completely fixed the leak. All the RAM used for database calls is released, and when the API is running without load, RAM usage is stable and low.

I could try building new docker images with Node.js 12.15 and 13 for you to test, if it would help your team understand what is happening. Have you managed to start it and get similar results as in my comment?

@cjbj
Member

cjbj commented Feb 6, 2020

@lkiii appreciate the offer (and your continued interest). Our testing framework is not Docker based. @j-antonio-rivera was testing pretty much your code. You mentioned that your problem was caused by parallel use, which isn't apparent in the code you shared. This is why I was interested in you trying out Node.js 12.15 yourself. Of course, we've mentioned why you should remove parallel use; I can imagine cases where it will use more memory. @j-antonio-rivera is about to look at Node.js 12.15 too.

@j-antonio-rivera

Hi @lkiii , @cjbj,

Just a quick update on this: I have checked the getRows function against Node.js 12.15, and I'm seeing the same RSS memory increment pattern as when Node.js 12.14.1 is used.

Object/Cursor - .getRows() (Running 50 Threads)
Node 12.15
Fetching 20 rows
1,028,821 hits

[memory usage chart]

@toan12a

toan12a commented Jun 6, 2020

Hi all,
I created an API with oracledb, but while testing many requests, RSS memory keeps increasing.

node -v
v11.6.0
npm -v
6.5.0-next.0
"oracledb": "^4.0.0",
Centos 6
Here is my dbconfig pool:

module.exports = {
    connectionLimit: 100,
    user: process.env.NODE_ORACLEDB_USER || "user",
    password: process.env.NODE_ORACLEDB_PASSWORD || "pass",
    connectString: process.env.NODE_ORACLEDB_CONNECTIONSTRING || "ip/orcl",
};
Can anyone help?

router.post("/get_idx", function(req, res) {
    console.log(
        "################# START get_idx ######################"
    );
    const used = process.memoryUsage();
    for (let key in used) { console.log(`${key} ${Math.round(used[key] / 1024 / 1024 * 100) / 100} MB`); }
    try {
        var jsonRequest = JSON.parse(JSON.stringify(req.body));
        // console.log(jsonRequest);
        var numRows = 1000;
        //  oracledb.fetchAsString = [oracledb.CLOB];
        async function run() {
            let pool;
            try {
                pool = await oracledb.createPool(dbConfig);
                var bindvars = {
                    cursor: { type: oracledb.CURSOR, dir: oracledb.BIND_OUT }
                };
                let connection;
                try {
                    connection = await pool.getConnection();
                    const result = await connection.execute(
                        "BEGIN :cursor := test(); END;",
                        bindvars, { outFormat: oracledb.OBJECT }
                    );
                    const resultSet = result.outBinds.cursor;
                    let rows;
                    do {
                        rows = await resultSet.getRows(numRows); // get numRows rows at a time
                        if (rows.length === 0) {
                            // no rows, or no more rows
                            var jsonResponse = {};
                            jsonResponse["ERROR"] = "0004";
                            jsonResponse["MESSAGE"] = "FAILED";
                            console.log(jsonResponse);
                            res.send(jsonResponse);
                        } else if (rows.length > 0) {
                            var newObj = rows.reduce((a, b) => Object.assign(a, b), {});
                            var jsonpush = {};
                            var jsonResponse = {};
                            var sub_ = [];
                            for (var i = 0; i < rows.length; i++) {
                                jsonpush["SYSDATE"] = rows[i].SYSDATE;
                                sub_.push(JSON.parse(JSON.stringify(jsonpush)));
                            }
                            jsonResponse["ERROR"] = newObj.ERROR;
                            jsonResponse["MESSAGE"] = newObj.MESSAGE;
                            jsonResponse["RESULT"] = newObj.RESULT;
                            jsonResponse["INFO"] = sub_;
                            //
                            //				console.log(jsonResponse);
                            res.send(jsonResponse);
                        }
                    } while (rows.length === numRows);
                    // always close the ResultSet
                    await resultSet.close();
                } catch (err) {
                    console.error(err);
                    var jsonResponse = {};
                    jsonResponse["ERROR"] = "0003";
                    jsonResponse["MESSAGE"] = "FAILED";
                    console.log(jsonResponse);
                    res.send(jsonResponse);
                } finally {
                    if (connection) {
                        try {
                            await connection.close(); // Put the connection back in the pool
                        } catch (err) {
                            console.error(err);
                            var jsonResponse = {};
                            jsonResponse["ERROR"] = "0003";
                            jsonResponse["MESSAGE"] = "FAILED";
                            res.send(jsonResponse); //
                        }
                    }
                }
            } catch (err) {
                console.error(err);
                var jsonResponse = {};
                jsonResponse["ERROR"] = "0003";
                jsonResponse["MESSAGE"] = "FAILED";
                console.log(jsonResponse);
                res.send(jsonResponse);
            } finally {
                await pool.close();
            }
        }
        run();
    } catch (e) {
        console.log("Exception in get_idx: " + e);
        jsonResponse["ERROR"] = "-1";
        jsonResponse["MESSAGE"] = "FAILED";
        res.send(jsonResponse);
    } finally {
        res.status(200);
        console.log("################# get_idx ######################");
    }
});

@cjbj
Member

cjbj commented Jun 6, 2020

  • start by upgrading Node.js, since 11.6 probably had some known N-API issue

  • and upgrade node-oracledb

  • When pasting code, use three backticks on the lines before & after so it formats. Anthony has just done that for you.

@cjbj
Member

cjbj commented Jun 6, 2020

And do not create the pool on each request! Create one pool as part of your application initialization outside the router. (It would be helpful if you can tell me why you did this, so I can improve documentation and examples).
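
A minimal sketch of that structure (the attribute values are illustrative):

    const oracledb = require('oracledb');

    // Run once at application startup, before the router starts handling traffic.
    async function initPool() {
      await oracledb.createPool({
        user: process.env.NODE_ORACLEDB_USER,
        password: process.env.NODE_ORACLEDB_PASSWORD,
        connectString: process.env.NODE_ORACLEDB_CONNECTIONSTRING,
        poolMax: 10                                   // node-oracledb's pool-size attribute
      });
    }

    // Inside each route handler: borrow and return a connection; never create a pool here.
    async function handler(req, res) {
      let connection;
      try {
        connection = await oracledb.getConnection();  // from the default pool
        const result = await connection.execute('select sysdate from dual');
        res.send(result.rows);
      } catch (err) {
        res.status(500).send({ error: err.message });
      } finally {
        if (connection) await connection.close();     // back to the pool
      }
    }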

@cjbj
Member

cjbj commented Jun 6, 2020

Is connectionLimit used by your code? It's not a node-oracledb attribute.

If your table contains an exact multiple of 1000 rows, you will be sending FAILED.

At some stage we reworked the getRows() example to avoid an extra getRows() call. The current resultset1.js has this logic:

    do {
      rows = await rs.getRows(numRows); // get numRows rows at a time
      if (rows.length > 0) {
        console.log("getRows(): Got " + rows.length + " rows");
        console.log(rows);
      }
    } while (rows.length === numRows);

Also you may find the (singular) getRow() is almost as fast as getRows(), which could simplify your implementation. You would tune the fetch performance with oracledb.fetchArraySize instead of numRows.
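
A sketch of the getRow() variant (resultSet stands for the cursor out bind from the execute() call; handleRow is a placeholder):

    // Tune the internal fetch batching with fetchArraySize instead of a numRows argument:
    // const result = await connection.execute(sql, binds, { fetchArraySize: 1000 });

    let row;
    while ((row = await resultSet.getRow())) {   // one row per call, prefetched internally
      handleRow(row);
    }
    await resultSet.close();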

@toan12a

toan12a commented Jun 6, 2020

And do not create the pool on each request! Create one pool as part of your application initialization outside the router. (It would be helpful if you can tell me why you did this, so I can improve documentation and examples).

Can you give an example of creating the pool outside the router? I have multiple router.post(...) handlers.

I have 2,000 rows in the database.
What happens if I change 1,000 rows to 100,000,000 rows? Will memory increase?

@cjbj
Member

cjbj commented Jun 6, 2020

The closest example I have is https://github.com/oracle/node-oracledb/blob/master/examples/webappawait.js

What happens if I change 1,000 rows to 100,000,000 rows? Will memory increase?

You mean increase numRows? I wouldn't do that. Just fix your code regarding your end-of-fetch logic.

Do the other things suggested, particularly upgrading, and let us know the outcome.

@j-antonio-rivera

Hi,

I have tried recent versions of Node.js (12.18.x and 14.4.0), and for this execution I used Oracle Instant Client 19.6 pointing to Oracle DB 19.6, with single- and multi-threaded runs. The behavior of the results is similar to what I reported in my comment on Jan 31; this issue is improved by the fix for N-API issue 28988.

These are my results:

Object/Cursor - .getRows() (Running 1 Thread)
Node 12.18.1
Fetching 20 rows
1 million iterations
[memory usage chart]

Object/Cursor - .getRows() (Running 50 Threads)
Node 12.18.1
Fetching 20 rows
1 million iterations
[memory usage chart]

Object/Cursor - .getRows() (Running 1 Thread)
Node 14.4.0
Fetching 20 rows
1 million iterations
[memory usage chart]

Object/Cursor - .getRows() (Running 50 Threads)
Node 14.4.0
Fetching 20 rows
1 million iterations
[memory usage chart]

I will run a couple more executions with more iterations, since in my older testing I didn't see the minor bumps I'm seeing now in the single-threaded executions. I will post an update once I have completed those executions.

@cjbj
Member

cjbj commented Jul 6, 2020

After discussion, I'm going to close this issue. The big memory leak does seem to have been the Node.js N-API bug. The fix has trickled into supported Node.js versions. Yay!

We'll continue to monitor and test and resolve issues as we find them.

@karkan20

@cjbj Hello, we have finally pinpointed the exact leak cause: const rows = await resultSet.outBinds[cursorName].getRows(rowsToGet)

After changing to

    const rows = []    // accumulator array (assumed; not shown in the original snippet)
    let row
    while ((row = await resultSet.outBinds[cursorName].getRow())) {
      rows.push(row)
    }

Memory usage went back to normal.
---
I found the same behavior: getRows() consumes a lot of RSS and does not release it after use. Switching to getRow() helped. I am using Node.js 20 and the latest version of oracledb (6.5.1); I also tried previous versions such as 5.5.0 and 5.3.0.

@cjbj reopened this on May 29, 2024
@cjbj
Member

cjbj commented May 29, 2024

@karkan20 is it a real leak or did the GC clean up eventually?

I don't recall seeing the CREATE TABLE statement. Can you share that and a current, complete & runnable testcase?

@karkan20

@cjbj it looks like a real leak where the object persists and never gets cleaned up; the memory consumption graph keeps going up. Restarting the Node.js app released the memory.

Not sure about the CREATE TABLE statement.

I am getting a result set and then fetching rows in batches with getRows(1024), transforming each batch to my requirements, then getting another batch, and so on.

Even if I change the batch size to something like 50, the memory is still consumed and never released unless the app restarts.
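
A sketch of that loop (the PL/SQL call and transform() are placeholders for your own code), including the close that should always run:

    const oracledb = require('oracledb');

    async function readAll(connection) {
      const result = await connection.execute(
        'BEGIN :cursor := mypkg.get_data(); END;',
        { cursor: { type: oracledb.CURSOR, dir: oracledb.BIND_OUT } }
      );
      const rs = result.outBinds.cursor;
      try {
        let rows;
        do {
          rows = await rs.getRows(1024);   // fetch one batch
          if (rows.length > 0) {
            transform(rows);               // placeholder per-batch processing
          }
        } while (rows.length === 1024);
      } finally {
        await rs.close();                  // always release the result set
      }
    }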

@cjbj
Member

cjbj commented May 29, 2024

@karkan20 what are the column types?

@karkan20

String, date, number

@sharadraju changed the title from "Memory leak with getRows() in Node.js 12" to "Memory leak with getRows()" on Jul 15, 2024
@sharadraju
Member

@karkan20 is it a real leak or did the GC clean up eventually?

I don't recall seeing the CREATE TABLE statement. Can you share that and a current, complete & runnable testcase?

@karkan20 As indicated by @cjbj, we request your help in sharing a runnable test case. This will help us get to the root cause of the problem.

@hjamil-24
Member

Also, if no more rows are going to be fetched, we should always close the resultSet. See the resultSet.close() doc.
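
For example (a sketch; the cursor out bind and row count are placeholders), a finally block ensures the result set is closed whether or not any rows are fetched:

    const rs = result.outBinds.cursor;
    try {
      const rows = await rs.getRows(100);
      // rows may be empty; either way the result set is closed below
      console.log(rows.length, 'rows fetched');
    } finally {
      await rs.close();
    }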
