Steps to reproduce:
1. Followed the quick start setup directions for the Codespace.
2. Once done with the setup steps, moved on to development.
3. Entered a prompt for my app and pressed Enter (model set to gpt-3.5-turbo-1106).
4. Clicked "Continue development".
Expected - Receive response like "You are using an API key that does not have available tokens. Please add tokens by upgrading your plan in order to proceed."
Actual -
`Do you want to:
1. Continue development
2. Provide guidance
3. Return to Super Coder Menu
Enter your choice (1-3): 1
Continuing development...
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
Python Version: 3.10.13
Pip Version: 24.0
Open-interpreter Version: cmd: Open Interpreter 0.2.3 New Computer Update, pkg: 0.2.3
OS Version and Architecture: Linux-6.2.0-1019-azure-x86_64-with-glibc2.31
CPU Info: x86_64
RAM Info: 7.74 GB, used: 1.98, free: 0.36
# Interpreter Info
Vision: False
Model: gpt-3.5-turbo-1106
Function calling: True
Context window: 16000
Max tokens: 4096
Auto run: True
API base: None
Offline: False
Curl output: Not local
# Messages
System Message: You are Open Interpreter, a world-class programmer that can complete any goal by executing code.
First, write a plan. Always recap the plan between each code block (you have extreme short-term memory loss, so you need to recap the plan between each message block to retain it).
When you execute code, it will be executed on the user's machine. The user has given you full and complete permission to execute any code necessary to complete the task. Execute the code.
If you want to send data between programming languages, save the data to a txt or json.
You can access the internet. Run any code to achieve the goal, and if at first you don't succeed, try again and again.
You can install new packages.
When a user refers to a filename, they're likely referring to an existing file in the directory you're currently executing code in.
Write messages to the user in Markdown.
In general, try to make plans with as few steps as possible. As for actually executing code to carry out that plan, for stateful languages (like python, javascript, shell, but NOT for html which starts from 0 every time) it's critical not to try to do everything in one code block. You should try something, print information about it, then continue from there in tiny, informed steps. You will never get it on the first try, and attempting it in one go will often lead to errors you cant see.
You are capable of any task.
THE COMPUTER API
A python computer module is ALREADY IMPORTED, and can be used for many tasks:
computer.browser.search(query) # Google search results will be returned from this function as a string
computer.files.edit(path_to_file, original_text, replacement_text) # Edit a file
computer.calendar.create_event(title="Meeting", start_date=datetime.datetime.now(), end=datetime.datetime.now() + datetime.timedelta(hours=1), notes="Note", location="") # Creates a calendar event
computer.calendar.get_events(start_date=datetime.date.today(), end_date=None) # Get events between dates. If end_date is None, only gets events for start_date
computer.calendar.delete_event(event_title="Meeting", start_date=datetime.datetime) # Delete a specific event with a matching title and start date, you may need to use get_events() to find the specific event object first
computer.contacts.get_phone_number("John Doe")
computer.contacts.get_email_address("John Doe")
computer.mail.send("john@email.com", "Meeting Reminder", "Reminder that our meeting is at 3pm today.", ["path/to/attachment.pdf", "path/to/attachment2.pdf"]) # Send an email with optional attachments
computer.mail.get(4, unread=True) # Returns the {number} of unread emails, or all emails if False is passed
computer.mail.unread_count() # Returns the number of unread emails
computer.sms.send("555-123-4567", "Hello from the computer!") # Send a text message. MUST be a phone number, so use computer.contacts.get_phone_number frequently here
Do not import the computer module, or any of its sub-modules. They are already imported.
User Info{{import getpass
import os
import platform
print(f"Name: {getpass.getuser()}")
print(f"CWD: {os.getcwd()}")
print(f"SHELL: {os.environ.get('SHELL')}")
print(f"OS: {platform.system()}")
}}
{'role': 'user', 'type': 'message', 'content': '**redacted by user.**'}
{'role': 'user', 'type': 'message', 'content': 'redacted by user.'}
Traceback (most recent call last):
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/llms/openai.py", line 414, in completion
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/llms/openai.py", line 332, in completion
return self.streaming(
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/llms/openai.py", line 514, in streaming
response = openai_client.chat.completions.create(**data, timeout=timeout)
File "/home/codespace/.python/current/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
File "/home/codespace/.python/current/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 667, in create
return self._post(
File "/home/codespace/.python/current/lib/python3.10/site-packages/openai/_base_client.py", line 1213, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/home/codespace/.python/current/lib/python3.10/site-packages/openai/_base_client.py", line 902, in request
return self._request(
File "/home/codespace/.python/current/lib/python3.10/site-packages/openai/_base_client.py", line 993, in _request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/main.py", line 999, in completion
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/main.py", line 972, in completion
response = openai_chat_completions.completion(
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/llms/openai.py", line 420, in completion
raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 234, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 2947, in wrapper
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 2845, in wrapper
result = original_function(*args, **kwargs)
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/main.py", line 2119, in completion
raise exception_type(
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 8526, in exception_type
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 7336, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspaces/rUv-dev/super_coder.py", line 141, in `<module>`
launch_super_coder()
File "/workspaces/rUv-dev/super_coder.py", line 32, in launch_super_coder
response = interpreter.chat(f"Continue developing the {app_name} application based on the prompt: {app_prompt}")
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/core.py", line 167, in chat
for _ in self._streaming_chat(message=message, display=display):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/core.py", line 196, in _streaming_chat
yield from terminal_interface(self, message)
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/terminal_interface/terminal_interface.py", line 136, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/core.py", line 235, in _streaming_chat
yield from self._respond_and_store()
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/core.py", line 281, in _respond_and_store
for chunk in respond(self):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/respond.py", line 69, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 204, in run
yield from run_function_calling_llm(self, params)
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
for chunk in llm.completions(**request_params):
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 237, in fixed_litellm_completions
raise first_error
File "/home/codespace/.python/current/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 218, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 2947, in wrapper
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 2845, in wrapper
result = original_function(*args, **kwargs)
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/main.py", line 2119, in completion
raise exception_type(
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 8526, in exception_type
raise e
File "/home/codespace/.python/current/lib/python3.10/site-packages/litellm/utils.py", line 7367, in exception_type
raise RateLimitError(
litellm.exceptions.RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}
Super Coder session completed successfully!`
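The expected behavior above could be achieved by catching the provider errors seen in the traceback around the `interpreter.chat()` call in super_coder.py. This is only a minimal sketch, not the project's actual code: the wrapper function name is hypothetical, and it assumes the litellm exception types (`RateLimitError`, `AuthenticationError`) that the log shows propagating out.

```python
# Hypothetical sketch: wrap the interpreter.chat() call so quota/auth
# failures surface as a friendly message instead of a raw traceback.
# Matches exceptions by class name to avoid a hard dependency on litellm here.

def chat_with_quota_check(interpreter, message):
    try:
        return interpreter.chat(message)
    except Exception as e:  # e.g. litellm.exceptions.RateLimitError / AuthenticationError
        name = type(e).__name__
        if name == "RateLimitError" or "insufficient_quota" in str(e):
            print("You are using an API key that does not have available tokens. "
                  "Please add tokens by upgrading your plan in order to proceed.")
        elif name == "AuthenticationError":
            print("Your API key appears to be invalid. Please check OPENAI_API_KEY.")
        else:
            raise  # unrelated errors still propagate
        return None
```

With this in place, the 429 `insufficient_quota` error would print the message from "Expected" above rather than a traceback followed by "Super Coder session completed successfully!".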