UploadException Error #1849

Open
JoshShailes opened this issue Jan 15, 2025 · 10 comments
Labels
bug (Something isn't working), question (Further information is requested)

Comments

@JoshShailes

Hi, I'm trying to run the code in the docs for Get Started and apart from on my very first run have been unable to upload the results to ragas app. I'm wondering if anyone has experienced this before and may have a fix? Or a way that I can test my connection to the api etc? Thanks in advance!

ERROR
UploadException: Failed to upload results: {"status":"error","status_code":500,"message":"An internal server error occured"}

@JoshShailes JoshShailes added the question Further information is requested label Jan 15, 2025
@dosubot dosubot bot added the bug Something isn't working label Jan 15, 2025
@jjmachan
Member

hey @JoshShailes - we are keeping an eye on this but haven't been able to reproduce it reliably, which is the problem.

could you tell us which version of Ragas you are using, and then:

  1. update to the latest version
  2. generate a new evaluation
  3. try upload again

let me know if this still persists - adding @ganeshrvel too to keep an eye on this

@ganeshrvel
Contributor

Hi @JoshShailes,

We have made a new release that touched the upload feature. Could you upgrade Ragas to 0.2.11 and try the steps @jjmachan mentioned above?

@JoshShailes
Author

JoshShailes commented Jan 20, 2025

Hi @ganeshrvel @jjmachan , I have version 0.2.11. I seem to get this issue when using gpt-3.5-turbo-16k as my evaluator llm. When using this model I get some StringIO errors during the evaluate() step and then get the upload error.

Maybe you can reproduce the error using this llm as the evaluator. I am also using Azure OpenAI.
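For reference, a minimal Azure OpenAI evaluator setup would look roughly like the sketch below - the endpoint, deployment name, API version, and key are all placeholders for your own Azure resource values, not my actual configuration:

```python
# Hypothetical Azure OpenAI evaluator setup - every value below is a
# placeholder; substitute your own Azure resource details.
from langchain_openai import AzureChatOpenAI
from ragas.llms import LangchainLLMWrapper

azure_llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_deployment="gpt-35-turbo-16k",  # your deployment name
    api_version="2024-02-01",
    api_key="<your-key>",
)
evaluator_llm = LangchainLLMWrapper(azure_llm)
```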

The Error during the evaluate step: Exception raised in Job[40]: AttributeError('StringIO' object has no attribute 'verdict')

When I used gpt-4o as the evaluator llm, I do not get the AttributeError and then the upload is successful.

@jjmachan
Member

jjmachan commented Jan 20, 2025

@JoshShailes could you run this again with raise_exceptions=True and share the traces?

this is a bug on our end and we'll get it fixed

@JoshShailes
Author

@jjmachan The evaluate function:

AttributeError                            Traceback (most recent call last)
Cell In[10], line 2
      1 from ragas import evaluate
----> 2 results = evaluate(eval_dataset, metrics=[metric], raise_exceptions=True)
      3 results

File ~\Ragas\Lib\site-packages\ragas\_analytics.py:227, in track_was_completed.<locals>.wrapper(*args, **kwargs)
    224 @wraps(func)
    225 def wrapper(*args: P.args, **kwargs: P.kwargs) -> t.Any:
    226     track(IsCompleteEvent(event_type=func.__name__, is_completed=False))
--> 227     result = func(*args, **kwargs)
    228     track(IsCompleteEvent(event_type=func.__name__, is_completed=True))
    230     return result

File ~\Ragas\Lib\site-packages\ragas\evaluation.py:318, in evaluate(dataset, metrics, llm, embeddings, experiment_name, callbacks, run_config, token_usage_parser, raise_exceptions, column_map, show_progress, batch_size, _run_id, _pbar)
    315     if not evaluation_group_cm.ended:
    316         evaluation_rm.on_chain_error(e)
--> 318     raise e
    319 else:
    320     # evalution run was successful
    321     # now lets process the results
    322     cost_cb = ragas_callbacks["cost_cb"] if "cost_cb" in ragas_callbacks else None

File ~\Ragas\Lib\site-packages\ragas\evaluation.py:298, in evaluate(dataset, metrics, llm, embeddings, experiment_name, callbacks, run_config, token_usage_parser, raise_exceptions, column_map, show_progress, batch_size, _run_id, _pbar)
    295 scores: t.List[t.Dict[str, t.Any]] = []
    296 try:
    297     # get the results
--> 298     results = executor.results()
    299     if results == []:
    300         raise ExceptionInRunner()

File ~\Ragas\Lib\site-packages\ragas\executor.py:213, in Executor.results(self)
    210             nest_asyncio.apply()
    211             self._nest_asyncio_applied = True
--> 213 results = asyncio.run(self._process_jobs())
    214 sorted_results = sorted(results, key=lambda x: x[0])
    215 return [r[1] for r in sorted_results]

File ~\Ragas\Lib\site-packages\nest_asyncio.py:30, in _patch_asyncio.<locals>.run(main, debug)
     28 task = asyncio.ensure_future(main)
     29 try:
---> 30     return loop.run_until_complete(task)
     31 finally:
     32     if not task.done():

File ~\Ragas\Lib\site-packages\nest_asyncio.py:98, in _patch_loop.<locals>.run_until_complete(self, future)
     95 if not f.done():
     96     raise RuntimeError(
     97         'Event loop stopped before Future completed.')
---> 98 return f.result()

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Ragas\Lib\site-packages\ragas\executor.py:141, in Executor._process_jobs(self)
    135 if self.pbar is None:
    136     with tqdm(
    137         total=len(self.jobs),
    138         desc=self.desc,
    139         disable=not self.show_progress,
    140     ) as internal_pbar:
--> 141         await self._process_coroutines(
    142             self.jobs, internal_pbar, results, max_workers
    143         )
    144 else:
    145     await self._process_coroutines(
    146         self.jobs, self.pbar, results, max_workers
    147     )

File ~\Ragas\Lib\site-packages\ragas\executor.py:191, in Executor._process_coroutines(self, jobs, pbar, results, max_workers)
    189 coroutines = [afunc(*args, **kwargs) for afunc, args, kwargs, _ in jobs]
    190 for future in await as_completed(coroutines, max_workers):
--> 191     result = await future
    192     results.append(result)
    193     pbar.update(1)

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:615, in as_completed.<locals>._wait_for_one()
    612 if f is None:
    613     # Dummy value from _on_timeout().
    614     raise exceptions.TimeoutError
--> 615 return f.result()

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Ragas\Lib\site-packages\ragas\executor.py:48, in as_completed.<locals>.sema_coro(coro)
     46 async def sema_coro(coro):
     47     async with semaphore:
---> 48         return await coro

File ~\Ragas\Lib\site-packages\ragas\executor.py:100, in Executor.wrap_callable_with_index.<locals>.wrapped_callable_async(*args, **kwargs)
     98 except Exception as e:
     99     if self.raise_exceptions:
--> 100         raise e
    101     else:
    102         exec_name = type(e).__name__

File ~\Ragas\Lib\site-packages\ragas\executor.py:96, in Executor.wrap_callable_with_index.<locals>.wrapped_callable_async(*args, **kwargs)
     92 async def wrapped_callable_async(
     93     *args, **kwargs
     94 ) -> t.Tuple[int, t.Callable | float]:
     95     try:
---> 96         result = await callable(*args, **kwargs)
     97         return counter, result
     98     except Exception as e:

File ~\Ragas\Lib\site-packages\ragas\metrics\base.py:541, in SingleTurnMetric.single_turn_ascore(self, sample, callbacks, timeout)
    539     if not group_cm.ended:
    540         rm.on_chain_error(e)
--> 541     raise e
    542 else:
    543     if not group_cm.ended:

File ~\Ragas\Lib\site-packages\ragas\metrics\base.py:534, in SingleTurnMetric.single_turn_ascore(self, sample, callbacks, timeout)
    527 rm, group_cm = new_group(
    528     self.name,
    529     inputs=sample.to_dict(),
    530     callbacks=callbacks,
    531     metadata={"type": ChainType.METRIC},
    532 )
    533 try:
--> 534     score = await asyncio.wait_for(
    535         self._single_turn_ascore(sample=sample, callbacks=group_cm),
    536         timeout=timeout,
    537     )
    538 except Exception as e:
    539     if not group_cm.ended:

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:489, in wait_for(fut, timeout)
    486         raise
    488 if fut.done():
--> 489     return fut.result()
    490 else:
    491     fut.remove_done_callback(cb)

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Ragas\Lib\site-packages\ragas\metrics\_aspect_critic.py:171, in AspectCritic._single_turn_ascore(self, sample, callbacks)
    167 async def _single_turn_ascore(
    168     self, sample: SingleTurnSample, callbacks: Callbacks
    169 ) -> float:
    170     row = sample.to_dict()
--> 171     return await self._ascore(row, callbacks)

File ~\Ragas\Lib\site-packages\ragas\metrics\_aspect_critic.py:196, in AspectCritic._ascore(self, row, callbacks)
    182 prompt_input = AspectCriticInput(
    183     user_input=user_input,
    184     response=response,
   (...)
    187     reference_contexts=reference_contexts,
    188 )
    190 response = await self.single_turn_prompt.generate(
    191     data=prompt_input,
    192     llm=self.llm,
    193     callbacks=callbacks,
    194 )
--> 196 return self._compute_score([response])

File ~\Ragas\Lib\site-packages\ragas\metrics\_aspect_critic.py:163, in AspectCritic._compute_score(self, safe_loaded_responses)
    159     score = Counter(
    160         [item.verdict for item in safe_loaded_responses]
    161     ).most_common(1)[0][0]
    162 else:
--> 163     score = safe_loaded_responses[0].verdict
    165 return score

File ~\Ragas\Lib\site-packages\pydantic\main.py:891, in BaseModel.__getattr__(self, item)
    888     return super().__getattribute__(item)  # Raises AttributeError if appropriate
    889 else:
    890     # this is the current error
--> 891     raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

AttributeError: 'StringIO' object has no attribute 'verdict'

@jjmachan
Member

Hey @JoshShailes thanks a lot for that - I'm not able to reproduce this with "gpt-3.5-turbo"

from ragas.llms import LangchainLLMWrapper
from ragas.embeddings import LangchainEmbeddingsWrapper
from langchain_openai import ChatOpenAI
from langchain_openai import OpenAIEmbeddings
from ragas import SingleTurnSample
from ragas.metrics import AspectCritic

evaluator_llm = LangchainLLMWrapper(ChatOpenAI(model="gpt-3.5-turbo"))
evaluator_embeddings = LangchainEmbeddingsWrapper(OpenAIEmbeddings())

test_data = {
    "user_input": "summarise given text\nThe company reported an 8% rise in Q3 2024, driven by strong performance in the Asian market. Sales in this region have significantly contributed to the overall growth. Analysts attribute this success to strategic marketing and product localization. The positive trend in the Asian market is expected to continue into the next quarter.",
    "response": "The company experienced an 8% increase in Q3 2024, largely due to effective marketing strategies and product adaptation, with expectations of continued growth in the coming quarter.",
}

metric = AspectCritic(name="summary_accuracy",llm=evaluator_llm, definition="Verify if the summary is accurate.")
test_data = SingleTurnSample(**test_data)
await metric.single_turn_ascore(test_data)

this is what I was running. If you have a code snippet that you could share, that would be really helpful

@JoshShailes
Author

Hi @jjmachan, I was running the code for a dataset; you will need to add your LLM to this snippet:

from datasets import load_dataset
from ragas import EvaluationDataset
from ragas.metrics import AspectCritic
from ragas import evaluate

eval_dataset = load_dataset("explodinggradients/earning_report_summary",split="train")
eval_dataset = EvaluationDataset.from_hf_dataset(eval_dataset)

print("Features in dataset:", eval_dataset.features())
print("Total samples in dataset:", len(eval_dataset))

metric = AspectCritic(name="summary_accuracy",llm=evaluator_llm, definition="Verify if the summary is accurate.")


results = evaluate(eval_dataset, metrics=[metric], raise_exceptions=True)

@jjmachan
Member

thanks a lot @JoshShailes - the root cause of this issue is #1831, so we will fix that shortly and cut a new release

@jjmachan
Member

@JoshShailes fixed with Pull Request #1864 (fix: output parser bug) - do try it out, and feel free to close this if it is fixed for you 🙂

@JoshShailes
Author

Hi @jjmachan, many thanks. I have updated to version 0.2.12 but I now get another error. I'm guessing it is because the response from gpt-3.5 is not in the JSON format expected by the parser.

---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:83, in JsonOutputParser.parse_result(self, result, partial)
     82 try:
---> 83     return parse_json_markdown(text)
     84 except JSONDecodeError as e:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:144, in parse_json_markdown(json_string, parser)
    143     json_str = json_string if match is None else match.group(2)
--> 144 return _parse_json(json_str, parser=parser)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:160, in _parse_json(json_str, parser)
    159 # Parse the JSON string into a Python dictionary
--> 160 return parser(json_str)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:118, in parse_partial_json(s, strict)
    115 # If we got here, we ran out of characters to remove
    116 # and still couldn't parse the string as JSON, so return the parse error
    117 # for the original string.
--> 118 return json.loads(s, strict=strict)

File ~\AppData\Local\Programs\Python\Python311\Lib\json\__init__.py:359, in loads(s, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    358     kw['parse_constant'] = parse_constant
--> 359 return cls(**kw).decode(s)

File ~\AppData\Local\Programs\Python\Python311\Lib\json\decoder.py:337, in JSONDecoder.decode(self, s, _w)
    333 """Return the Python representation of ``s`` (a ``str`` instance
    334 containing a JSON document).
    335 
    336 """
--> 337 obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    338 end = _w(s, end).end()

File ~\AppData\Local\Programs\Python\Python311\Lib\json\decoder.py:353, in JSONDecoder.raw_decode(self, s, idx)
    352 try:
--> 353     obj, end = self.scan_once(s, idx)
    354 except StopIteration as err:

JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)

The above exception was the direct cause of the following exception:

OutputParserException                     Traceback (most recent call last)
File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\prompt\pydantic_prompt.py:401, in RagasOutputParser.parse_output_string(self, output_string, prompt_value, llm, callbacks, retries_left)
    400     jsonstr = extract_json(output_string)
--> 401     result = super().parse(jsonstr)
    402 except OutputParserException:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:83, in PydanticOutputParser.parse(self, text)
     75 """Parse the output of an LLM call to a pydantic object.
     76 
     77 Args:
   (...)
     81     The parsed pydantic object.
     82 """
---> 83 return super().parse(text)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:97, in JsonOutputParser.parse(self, text)
     89 """Parse the output of an LLM call to a JSON object.
     90 
     91 Args:
   (...)
     95     The parsed JSON object.
     96 """
---> 97 return self.parse_result([Generation(text=text)])

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:72, in PydanticOutputParser.parse_result(self, result, partial)
     71     return None
---> 72 raise e

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:67, in PydanticOutputParser.parse_result(self, result, partial)
     66 try:
---> 67     json_object = super().parse_result(result)
     68     return self._parse_obj(json_object)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:86, in JsonOutputParser.parse_result(self, result, partial)
     85 msg = f"Invalid json output: {text}"
---> 86 raise OutputParserException(msg, llm_output=text) from e

OutputParserException: Invalid json output: {'reason': 'The summary accurately captures the main points of the given text.', 'verdict': 1}
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/OUTPUT_PARSING_FAILURE 

During handling of the above exception, another exception occurred:

JSONDecodeError                           Traceback (most recent call last)
File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:83, in JsonOutputParser.parse_result(self, result, partial)
     82 try:
---> 83     return parse_json_markdown(text)
     84 except JSONDecodeError as e:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:144, in parse_json_markdown(json_string, parser)
    143     json_str = json_string if match is None else match.group(2)
--> 144 return _parse_json(json_str, parser=parser)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:160, in _parse_json(json_str, parser)
    159 # Parse the JSON string into a Python dictionary
--> 160 return parser(json_str)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\utils\json.py:118, in parse_partial_json(s, strict)
    115 # If we got here, we ran out of characters to remove
    116 # and still couldn't parse the string as JSON, so return the parse error
    117 # for the original string.
--> 118 return json.loads(s, strict=strict)

File ~\AppData\Local\Programs\Python\Python311\Lib\json\__init__.py:359, in loads(s, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    358     kw['parse_constant'] = parse_constant
--> 359 return cls(**kw).decode(s)

File ~\AppData\Local\Programs\Python\Python311\Lib\json\decoder.py:337, in JSONDecoder.decode(self, s, _w)
    333 """Return the Python representation of ``s`` (a ``str`` instance
    334 containing a JSON document).
    335 
    336 """
--> 337 obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    338 end = _w(s, end).end()

File ~\AppData\Local\Programs\Python\Python311\Lib\json\decoder.py:353, in JSONDecoder.raw_decode(self, s, idx)
    352 try:
--> 353     obj, end = self.scan_once(s, idx)
    354 except StopIteration as err:

JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)

The above exception was the direct cause of the following exception:

OutputParserException                     Traceback (most recent call last)
Cell In[17], line 2
      1 from ragas import evaluate
----> 2 results = evaluate(eval_dataset, metrics=[metric], raise_exceptions=True)
      3 results

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\_analytics.py:227, in track_was_completed.<locals>.wrapper(*args, **kwargs)
    224 @wraps(func)
    225 def wrapper(*args: P.args, **kwargs: P.kwargs) -> t.Any:
    226     track(IsCompleteEvent(event_type=func.__name__, is_completed=False))
--> 227     result = func(*args, **kwargs)
    228     track(IsCompleteEvent(event_type=func.__name__, is_completed=True))
    230     return result

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\evaluation.py:318, in evaluate(dataset, metrics, llm, embeddings, experiment_name, callbacks, run_config, token_usage_parser, raise_exceptions, column_map, show_progress, batch_size, _run_id, _pbar)
    315     if not evaluation_group_cm.ended:
    316         evaluation_rm.on_chain_error(e)
--> 318     raise e
    319 else:
    320     # evalution run was successful
    321     # now lets process the results
    322     cost_cb = ragas_callbacks["cost_cb"] if "cost_cb" in ragas_callbacks else None

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\evaluation.py:298, in evaluate(dataset, metrics, llm, embeddings, experiment_name, callbacks, run_config, token_usage_parser, raise_exceptions, column_map, show_progress, batch_size, _run_id, _pbar)
    295 scores: t.List[t.Dict[str, t.Any]] = []
    296 try:
    297     # get the results
--> 298     results = executor.results()
    299     if results == []:
    300         raise ExceptionInRunner()

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:213, in Executor.results(self)
    210             nest_asyncio.apply()
    211             self._nest_asyncio_applied = True
--> 213 results = asyncio.run(self._process_jobs())
    214 sorted_results = sorted(results, key=lambda x: x[0])
    215 return [r[1] for r in sorted_results]

File ~\Documents\Ragas2\Ragas\Lib\site-packages\nest_asyncio.py:30, in _patch_asyncio.<locals>.run(main, debug)
     28 task = asyncio.ensure_future(main)
     29 try:
---> 30     return loop.run_until_complete(task)
     31 finally:
     32     if not task.done():

File ~\Documents\Ragas2\Ragas\Lib\site-packages\nest_asyncio.py:98, in _patch_loop.<locals>.run_until_complete(self, future)
     95 if not f.done():
     96     raise RuntimeError(
     97         'Event loop stopped before Future completed.')
---> 98 return f.result()

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:141, in Executor._process_jobs(self)
    135 if self.pbar is None:
    136     with tqdm(
    137         total=len(self.jobs),
    138         desc=self.desc,
    139         disable=not self.show_progress,
    140     ) as internal_pbar:
--> 141         await self._process_coroutines(
    142             self.jobs, internal_pbar, results, max_workers
    143         )
    144 else:
    145     await self._process_coroutines(
    146         self.jobs, self.pbar, results, max_workers
    147     )

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:191, in Executor._process_coroutines(self, jobs, pbar, results, max_workers)
    189 coroutines = [afunc(*args, **kwargs) for afunc, args, kwargs, _ in jobs]
    190 for future in await as_completed(coroutines, max_workers):
--> 191     result = await future
    192     results.append(result)
    193     pbar.update(1)

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:615, in as_completed.<locals>._wait_for_one()
    612 if f is None:
    613     # Dummy value from _on_timeout().
    614     raise exceptions.TimeoutError
--> 615 return f.result()

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:48, in as_completed.<locals>.sema_coro(coro)
     46 async def sema_coro(coro):
     47     async with semaphore:
---> 48         return await coro

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:100, in Executor.wrap_callable_with_index.<locals>.wrapped_callable_async(*args, **kwargs)
     98 except Exception as e:
     99     if self.raise_exceptions:
--> 100         raise e
    101     else:
    102         exec_name = type(e).__name__

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\executor.py:96, in Executor.wrap_callable_with_index.<locals>.wrapped_callable_async(*args, **kwargs)
     92 async def wrapped_callable_async(
     93     *args, **kwargs
     94 ) -> t.Tuple[int, t.Callable | float]:
     95     try:
---> 96         result = await callable(*args, **kwargs)
     97         return counter, result
     98     except Exception as e:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\metrics\base.py:541, in SingleTurnMetric.single_turn_ascore(self, sample, callbacks, timeout)
    539     if not group_cm.ended:
    540         rm.on_chain_error(e)
--> 541     raise e
    542 else:
    543     if not group_cm.ended:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\metrics\base.py:534, in SingleTurnMetric.single_turn_ascore(self, sample, callbacks, timeout)
    527 rm, group_cm = new_group(
    528     self.name,
    529     inputs=sample.to_dict(),
    530     callbacks=callbacks,
    531     metadata={"type": ChainType.METRIC},
    532 )
    533 try:
--> 534     score = await asyncio.wait_for(
    535         self._single_turn_ascore(sample=sample, callbacks=group_cm),
    536         timeout=timeout,
    537     )
    538 except Exception as e:
    539     if not group_cm.ended:

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:489, in wait_for(fut, timeout)
    486         raise
    488 if fut.done():
--> 489     return fut.result()
    490 else:
    491     fut.remove_done_callback(cb)

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\futures.py:203, in Future.result(self)
    201 self.__log_traceback = False
    202 if self._exception is not None:
--> 203     raise self._exception.with_traceback(self._exception_tb)
    204 return self._result

File ~\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:277, in Task.__step(***failed resolving arguments***)
    273 try:
    274     if exc is None:
    275         # We use the `send` method directly, because coroutines
    276         # don't have `__iter__` and `__next__` methods.
--> 277         result = coro.send(None)
    278     else:
    279         result = coro.throw(exc)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\metrics\_aspect_critic.py:171, in AspectCritic._single_turn_ascore(self, sample, callbacks)
    167 async def _single_turn_ascore(
    168     self, sample: SingleTurnSample, callbacks: Callbacks
    169 ) -> float:
    170     row = sample.to_dict()
--> 171     return await self._ascore(row, callbacks)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\metrics\_aspect_critic.py:190, in AspectCritic._ascore(self, row, callbacks)
    180 reference_contexts = row.get("reference_contexts")
    182 prompt_input = AspectCriticInput(
    183     user_input=user_input,
    184     response=response,
   (...)
    187     reference_contexts=reference_contexts,
    188 )
--> 190 response = await self.single_turn_prompt.generate(
    191     data=prompt_input,
    192     llm=self.llm,
    193     callbacks=callbacks,
    194 )
    196 return self._compute_score([response])

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\prompt\pydantic_prompt.py:127, in PydanticPrompt.generate(self, llm, data, temperature, stop, callbacks, retries_left)
    124 callbacks = callbacks or []
    126 # this is just a special case of generate_multiple
--> 127 output_single = await self.generate_multiple(
    128     llm=llm,
    129     data=data,
    130     n=1,
    131     temperature=temperature,
    132     stop=stop,
    133     callbacks=callbacks,
    134     retries_left=retries_left,
    135 )
    136 return output_single[0]

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\prompt\pydantic_prompt.py:201, in PydanticPrompt.generate_multiple(self, llm, data, n, temperature, stop, callbacks, retries_left)
    199 output_string = resp.generations[0][i].text
    200 try:
--> 201     answer = await parser.parse_output_string(
    202         output_string=output_string,
    203         prompt_value=prompt_value,
    204         llm=llm,
    205         callbacks=prompt_cb,
    206         retries_left=retries_left,
    207     )
    208     processed_output = self.process_output(answer, data)  # type: ignore
    209     output_models.append(processed_output)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\ragas\prompt\pydantic_prompt.py:419, in RagasOutputParser.parse_output_string(self, output_string, prompt_value, llm, callbacks, retries_left)
    409     fixed_output_string = await fix_output_format_prompt.generate(
    410         llm=llm,
    411         data=OutputStringAndPrompt(
   (...)
    416         retries_left=retries_left - 1,
    417     )
    418     retry_rm.on_chain_end({"fixed_output_string": fixed_output_string})
--> 419     result = super().parse(fixed_output_string.text)
    420 else:
    421     raise RagasOutputParserException()

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:83, in PydanticOutputParser.parse(self, text)
     74 def parse(self, text: str) -> TBaseModel:
     75     """Parse the output of an LLM call to a pydantic object.
     76 
     77     Args:
   (...)
     81         The parsed pydantic object.
     82     """
---> 83     return super().parse(text)

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:97, in JsonOutputParser.parse(self, text)
     88 def parse(self, text: str) -> Any:
     89     """Parse the output of an LLM call to a JSON object.
     90 
     91     Args:
   (...)
     95         The parsed JSON object.
     96     """
---> 97     return self.parse_result([Generation(text=text)])

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:72, in PydanticOutputParser.parse_result(self, result, partial)
     70 if partial:
     71     return None
---> 72 raise e

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\pydantic.py:67, in PydanticOutputParser.parse_result(self, result, partial)
     54 """Parse the result of an LLM call to a pydantic object.
     55 
     56 Args:
   (...)
     64     The parsed pydantic object.
     65 """
     66 try:
---> 67     json_object = super().parse_result(result)
     68     return self._parse_obj(json_object)
     69 except OutputParserException as e:

File ~\Documents\Ragas2\Ragas\Lib\site-packages\langchain_core\output_parsers\json.py:86, in JsonOutputParser.parse_result(self, result, partial)
     84 except JSONDecodeError as e:
     85     msg = f"Invalid json output: {text}"
---> 86     raise OutputParserException(msg, llm_output=text) from e

OutputParserException: Invalid json output: {'reason': 'The summary accurately captures the main points of the given text.', 'verdict': 1}
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/OUTPUT_PARSING_FAILURE 
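The failure mode is visible in the error text itself: the model returned a Python-dict-style string with single quotes, which is valid Python literal syntax but not valid JSON, so `json.loads` rejects it. A quick stdlib repro (just an illustration of the parse failure, not a suggested fix for ragas):

```python
import ast
import json

# The raw output reported in the traceback above: single-quoted keys and
# values, which json.loads cannot parse.
raw = "{'reason': 'The summary accurately captures the main points of the given text.', 'verdict': 1}"

try:
    json.loads(raw)
except json.JSONDecodeError as e:
    # Reproduces the error from the traceback:
    # "Expecting property name enclosed in double quotes"
    print("json.loads fails:", e)

# ast.literal_eval accepts Python literal syntax, so the same string parses.
parsed = ast.literal_eval(raw)
print(parsed["verdict"])  # 1
```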
