
npm run embed has an error: embeddingResponse.data is '', undefined is not iterable #6

okerivy opened this issue Mar 4, 2023 · 3 comments

okerivy commented Mar 4, 2023

I ran `npm run embed` and got the following error message.

This line of code results in an error:
`const embeddingResponse = await openai.createEmbedding`
It is probably caused by `embeddingResponse.data` being empty (`''`).
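
A minimal sketch of a guard around that call, assuming the v3 `openai` Node client (matching the `OpenAI/NodeJS/3.1.0` user agent in the log below); `openai` and `input` stand in for the script's own client instance and text chunk, so this is a sketch rather than the repo's actual code:

```ts
// Sketch only, not the repo's exact code: guard the response before
// destructuring so an empty body fails loudly instead of throwing
// "undefined is not iterable".
const embeddingResponse = await openai.createEmbedding({
  model: "text-embedding-ada-002",
  input,
});

// In the v3 client the axios body lives on `.data`, and the embeddings
// array on `.data.data`. If the body came back empty (as in the log below),
// surface the status and raw body instead of crashing on the destructure.
if (!embeddingResponse.data || !Array.isArray(embeddingResponse.data.data)) {
  throw new Error(
    `Unexpected embedding response (status ${embeddingResponse.status}): ` +
      JSON.stringify(embeddingResponse.data)
  );
}

const [{ embedding }] = embeddingResponse.data.data;
```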

I have configured .env.local:

```
OPENAI_API_KEY=<my key>
NEXT_PUBLIC_SUPABASE_URL=<Project URL>
SUPABASE_SERVICE_ROLE_KEY=<Project API keys, service_role (secret)>
```

I have already run the four SQL commands in Supabase. There is a table called "pg" in the database; do I need to set permissions on it?
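
As a sanity check, here is a hedged sketch of how those three variables are typically wired up in a script like this (the exact wiring is an assumption, not necessarily this repo's code), which makes it easy to confirm they actually load from .env.local:

```ts
// Sketch under assumptions: the script builds an OpenAI v3 client and a
// Supabase client from the three .env.local variables. Fail fast if any
// of them did not load.
import { Configuration, OpenAIApi } from "openai";
import { createClient } from "@supabase/supabase-js";

for (const name of [
  "OPENAI_API_KEY",
  "NEXT_PUBLIC_SUPABASE_URL",
  "SUPABASE_SERVICE_ROLE_KEY",
] as const) {
  if (!process.env[name]) {
    throw new Error(`Missing environment variable: ${name}`);
  }
}

const openai = new OpenAIApi(
  new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);
```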

```
$ npm run embed

> [email protected] embed
> tsx scripts/embed.ts

Loaded env from .env.local

error:
paul-graham-gpt/scripts/embed.ts:30
      const [{ embedding }] = embeddingResponse.data.data;
                              ^

TypeError: undefined is not iterable (cannot read property Symbol(Symbol.iterator))
    at generateEmbeddings (/paul-graham-gpt/scripts/embed.ts:30:31)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at <anonymous> (paul-graham-gpt/scripts/embed.ts:60:3)

Node.js v18.14.0
```



**The logged `embeddingResponse` is:**

```
embeddingResponse {
  status: 200,
  statusText: 'Connection established',
  headers: {},
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'OpenAI/NodeJS/3.1.0',
      Authorization: 'Bearer sk-mykey',
      'Content-Length': 822,
      host: 'api.openai.com'
    },
    method: 'post',
    data: `{"model":"text-embedding-ada-002","input":"(Someone fed my essays into GPT to make something that could answer questions based on them, then asked it where good ideas come from. The answer was ok, but not what I would have said. This is what I would have said.)The way to get new ideas is to notice anomalies: what seems strange, or missing, or broken? You can see anomalies in everyday life (much of standup comedy is based on this), but the best place to look for them is at the frontiers of knowledge. Knowledge grows fractally. From a distance its edges look smooth, but when you learn enough to get close to one, you'll notice it's full of gaps. These gaps will seem obvious; it will seem inexplicable that no one has tried x or wondered about y. In the best case, exploring such gaps yields whole new fractal buds."}`,
    url: 'https://api.openai.com/v1/embeddings'
  },
  request: <ref *1> ClientRequest {
    _events: [Object: null prototype] {
      abort: [Function (anonymous)],
      aborted: [Function (anonymous)],
      connect: [Function (anonymous)],
      error: [Function (anonymous)],
      socket: [Function (anonymous)],
      timeout: [Function (anonymous)],
      finish: [Function: requestOnFinish]
    },
    _eventsCount: 7,
    _maxListeners: undefined,
    outputData: [],
    outputSize: 0,
    writable: true,
    destroyed: false,
    _last: true,
    chunkedEncoding: false,
    shouldKeepAlive: false,
    maxRequestsOnConnectionReached: false,
    _defaultKeepAlive: true,
    useChunkedEncodingByDefault: true,
    sendDate: false,
    _removedConnection: false,
    _removedContLen: false,
    _removedTE: false,
    strictContentLength: false,
    _contentLength: 822,
    _hasBody: true,
    _trailer: '',
    finished: true,
    _headerSent: true,
    _closed: false,
    socket: Socket {
      connecting: false,
      _hadError: false,
      _parent: null,
      _host: null,
      _closeAfterHandlingError: false,
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 8,
      _maxListeners: undefined,
      _writableState: [WritableState],
      allowHalfOpen: false,
      _sockname: null,
      _pendingData: null,
      _pendingEncoding: '',
      server: null,
      _server: null,
      parser: null,
      _httpMessage: [Circular *1],
      write: [Function: writeAfterFIN],
      [Symbol(async_id_symbol)]: 57,
      [Symbol(kHandle)]: null,
      [Symbol(lastWriteQueueSize)]: 0,
      [Symbol(timeout)]: null,
      [Symbol(kBuffer)]: null,
      [Symbol(kBufferCb)]: null,
      [Symbol(kBufferGen)]: null,
      [Symbol(kCapture)]: false,
      [Symbol(kSetNoDelay)]: true,
      [Symbol(kSetKeepAlive)]: true,
      [Symbol(kSetKeepAliveInitialDelay)]: 60,
      [Symbol(kBytesRead)]: 39,
      [Symbol(kBytesWritten)]: 1121
    },
    _header: 'POST https://api.openai.com/v1/embeddings HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'Content-Type: application/json\r\n' +
      'User-Agent: OpenAI/NodeJS/3.1.0\r\n' +
      'Authorization: Bearer sk-mykey\r\n' +
      'Content-Length: 822\r\n' +
      'host: api.openai.com\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    _keepAliveTimeout: 0,
    _onPendingData: [Function: nop],
    agent: Agent {
      _events: [Object: null prototype],
      _eventsCount: 2,
      _maxListeners: undefined,
      defaultPort: 80,
      protocol: 'http:',
      options: [Object: null prototype],
      requests: [Object: null prototype] {},
      sockets: [Object: null prototype],
      freeSockets: [Object: null prototype] {},
      keepAliveMsecs: 1000,
      keepAlive: false,
      maxSockets: Infinity,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      maxTotalSockets: Infinity,
      totalSocketCount: 1,
      [Symbol(kCapture)]: false
    },
    socketPath: undefined,
    method: 'POST',
    maxHeaderSize: undefined,
    insecureHTTPParser: undefined,
    joinDuplicateHeaders: undefined,
    path: 'https://api.openai.com/v1/embeddings',
    _ended: true,
    res: IncomingMessage {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 4,
      _maxListeners: undefined,
      socket: [Socket],
      httpVersionMajor: 1,
      httpVersionMinor: 1,
      httpVersion: '1.1',
      complete: true,
      rawHeaders: [],
      rawTrailers: [],
      joinDuplicateHeaders: undefined,
      aborted: false,
      upgrade: false,
      url: '',
      method: null,
      statusCode: 200,
      statusMessage: 'Connection established',
      client: [Socket],
      _consuming: true,
      _dumped: false,
      req: [Circular *1],
      responseUrl: 'https://api.openai.com/v1/embeddings',
      redirects: [],
      [Symbol(kCapture)]: false,
      [Symbol(kHeaders)]: {},
      [Symbol(kHeadersCount)]: 0,
      [Symbol(kTrailers)]: null,
      [Symbol(kTrailersCount)]: 0
    },
    aborted: false,
    timeoutCb: null,
    upgradeOrConnect: false,
    parser: null,
    maxHeadersCount: null,
    reusedSocket: false,
    host: '127.0.0.1',
    protocol: 'http:',
    _redirectable: Writable {
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 3,
      _maxListeners: undefined,
      _options: [Object],
      _ended: true,
      _ending: true,
      _redirectCount: 0,
      _redirects: [],
      _requestBodyLength: 822,
      _requestBodyBuffers: [],
      _onNativeResponse: [Function (anonymous)],
      _currentRequest: [Circular *1],
      _currentUrl: 'https://api.openai.com/v1/embeddings',
      [Symbol(kCapture)]: false
    },
    [Symbol(kCapture)]: false,
    [Symbol(kBytesWritten)]: 0,
    [Symbol(kEndCalled)]: true,
    [Symbol(kNeedDrain)]: false,
    [Symbol(corked)]: 0,
    [Symbol(kOutHeaders)]: [Object: null prototype] {
      accept: [Array],
      'content-type': [Array],
      'user-agent': [Array],
      authorization: [Array],
      'content-length': [Array],
      host: [Array]
    },
    [Symbol(errored)]: null,
    [Symbol(kUniqueHeaders)]: null
  },
  data: ''
}
```
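
Reading that log, `statusText: 'Connection established'`, `rawHeaders: []`, `data: ''`, and a request that went to `host: '127.0.0.1'` over `protocol: 'http:'` all suggest the call was routed through a local HTTP proxy (for example a VPN client) and the proxy's CONNECT reply was treated as the final response, so the real OpenAI body never arrived. A commonly suggested workaround with the v3 client behind such a proxy is to pass an explicit proxy agent and disable axios's built-in proxy handling; the sketch below assumes a hypothetical local proxy on port 7890, and `openai`/`input` again stand in for the script's own client and text chunk:

```ts
// Hedged workaround sketch for running behind a local HTTP proxy / VPN client.
// The proxy URL and port are hypothetical; adjust to your own setup.
import { HttpsProxyAgent } from "https-proxy-agent"; // named export in https-proxy-agent v7

const agent = new HttpsProxyAgent("http://127.0.0.1:7890");

const embeddingResponse = await openai.createEmbedding(
  { model: "text-embedding-ada-002", input },
  // Second argument is the per-request Axios config in the v3 client:
  // route HTTPS through the explicit agent and turn off axios's own proxying.
  { httpsAgent: agent, proxy: false }
);
```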



@DevinLin01

### When I run this command, it shows: connect ETIMEDOUT 108.160.172.200:443

```
npm run embed

> [email protected] embed
> tsx scripts/embed.ts

Loaded env from .env.local
node:internal/process/promises:288
            triggerUncaughtException(err, true /* fromPromise */);
            ^

<ref *1> Error: connect ETIMEDOUT 108.160.172.200:443
    at __node_internal_captureLargerStackTrace (node:internal/errors:484:5)
    at __node_internal_exceptionWithHostPort (node:internal/errors:662:12)
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1300:16) {
```


yuqiliao commented Aug 6, 2023

I experienced the same error message as @DevinLin01. By any chance, are there any updates/fixes for it?


yuqiliao commented Aug 6, 2023

Just to provide an update here: it seems the error I mentioned was caused by my VPN; once I switched to a different VPN setting, it works now. I have also read that switching the VPN off entirely should work too (it will not in my case, as the OpenAI API is not accessible without a VPN). Hope this helps.
