Setup.py fatal error: 'processor/wetext_processor.h' file not found #2032

Closed
Runtrons opened this issue Sep 28, 2023 · 10 comments

Runtrons commented Sep 28, 2023

Hi, I haven't had any success running this so far. I have gotten this error every time I try to run anything using the wenetruntime python package:

libc++abi: terminating due to uncaught exception of type c10::Error: Type c10::intrusive_ptr<ConvPackedParamsBase<2>, c10::detail::intrusive_target_default_null_type<ConvPackedParamsBase<2> > > could not be converted to any of the known types.
Exception raised from operator() at /Users/runner/work/pytorch/pytorch/pytorch/aten/src/ATen/core/jit_type.h:1756 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator>) + 98 (0x11fa8c992 in libc10.dylib)
frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&) + 106 (0x11fa8b0aa in libc10.dylib)
frame #2: c10::detail::getTypePtr
<c10::intrusive_ptr<ConvPackedParamsBase<2>, c10::detail::intrusive_target_default_null_type<ConvPackedParamsBase<2>>>>::call()::'lambda'()::operator()() const + 275 (0x15581e233 in libtorch_cpu.dylib)
frame #3: c10::Type::SingletonOrSharedTypePtrc10::Type c10::getFakeTypePtrCopy<c10::intrusive_ptr<ConvPackedParamsBase<2>, c10::detail::intrusive_target_default_null_type<ConvPackedParamsBase<2>>>>() + 25 (0x15581dfc9 in libtorch_cpu.dylib)
frame #4: c10::detail::infer_schema::(anonymous namespace)::createArgumentVector(c10::ArrayRefc10::detail::infer_schema::ArgumentDef) + 188 (0x15509197c in libtorch_cpu.dylib)
frame #5: c10::detail::infer_schema::make_function_schema(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator>&&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator>&&, c10::ArrayRefc10::detail::infer_schema::ArgumentDef, c10::ArrayRefc10::detail::infer_schema::ArgumentDef) + 123 (0x15509174b in libtorch_cpu.dylib)
frame #6: c10::detail::infer_schema::make_function_schema(c10::ArrayRefc10::detail::infer_schema::ArgumentDef, c10::ArrayRefc10::detail::infer_schema::ArgumentDef) + 76 (0x155091d6c in libtorch_cpu.dylib)
frame #7: std::__1::unique_ptr<c10::FunctionSchema, std::__1::default_deletec10::FunctionSchema> c10::detail::inferFunctionSchemaFromFunctor<at::Tensor (*)(at::Tensor, c10::intrusive_ptr<ConvPackedParamsBase<2>, c10::detail::intrusive_target_default_null_type<ConvPackedParamsBase<2>>> const&, double, long long)>() + 180 (0x15587ade4 in libtorch_cpu.dylib)

And the error goes on and on. I decided to run from source, so I tried:

  1. git clone https://github.com/wenet-e2e/wenet.git
  2. cd wenet/runtime/binding/python
  3. python setup.py install

When I do, it fails after countless warnings with this fatal error:

[ 69%] Built target kaldi-decoder  # Was working
[ 74%] Built target frontend  # Was working
[ 76%] Building CXX object post_processor/CMakeFiles/post_processor.dir/post_processor.cc.o  # Was working
In file included from /Users/ronangrant/Safe/wenet2/wenet/runtime/binding/python/post_processor/post_processor.cc:16:  # Error
/Users/ronangrant/Safe/wenet2/wenet/runtime/binding/python/post_processor/post_processor.h:22:10: fatal error: 'processor/wetext_processor.h' file not found
#include "processor/wetext_processor.h"
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
make[3]: *** [post_processor/CMakeFiles/post_processor.dir/post_processor.cc.o] Error 1
make[2]: *** [post_processor/CMakeFiles/post_processor.dir/all] Error 2
make[1]: *** [CMakeFiles/_wenet.dir/rule] Error 2
make: *** [_wenet] Error 2

......

File "setup.py", line 38, in build_extension
raise Exception(
Exception:
Build wenet failed. Please check the error message.
You can ask for help by creating an issue on GitHub.

I am using macOS on an Intel Mac. I have tried Python 3.10, 3.9, and 3.8; the error above is from Python 3.8.

If you could help, I would really appreciate it.

@kuruvachankgeorge

Creating a soft link to the processor directory should solve this. Run the command below before running python setup.py install:
ln -s ../../../runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor
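
If the link target does not exist yet, ln -s will still create a dangling symlink, so it can help to check for it first. A minimal sketch, run from wenet/runtime/binding/python (assuming the default fc_base layout; that directory only appears after the libtorch runtime has been configured, as discussed further down this thread):

  TARGET=../../../runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor
  if [ -d "$TARGET" ]; then
      ln -s "$TARGET" processor   # same link the one-argument ln -s above creates
  else
      echo "missing $TARGET -- build runtime/libtorch first"
  fi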

Runtrons commented Sep 29, 2023

@kuruvachankgeorge I do not see a runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor directory in the repo. The only fc_base directory I found was in wenet/runtime/binding/python/fc_base. Am I missing something? Thank you so much for your help.

kuruvachankgeorge commented Sep 29, 2023

  1. git clone https://github.com/wenet-e2e/wenet.git
  2. cd wenet/runtime/binding/python
  3. ln -s ../../../runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor
  4. python setup.py install

@Runtrons

@kuruvachankgeorge Thanks so much for helping me, it means a lot. There still isn't an fc_base directory inside of libtorch, as you can see here:
(screenshot attached)

Could that be the problem? Am I doing something wrong? Thanks

kuruvachankgeorge commented Sep 29, 2023

I guess you didn't build the runtime in libtorch. So you can follow these steps:

  1. git clone https://github.com/wenet-e2e/wenet.git
  2. cd wenet/runtime/libtorch
  3. mkdir build && cd build && cmake -DGRAPH_TOOLS=ON .. && cmake --build .
  4. cd ../../binding/python
  5. ln -s ../../../runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor
  6. python setup.py install

Note: runtime build requires cmake 3.14 or above (conda install -c anaconda cmake)
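
As a quick sanity check after step 3 (a sketch run from the top of the cloned wenet repository, assuming the default fc_base layout used above), you can confirm the CMake version and that the fetched sources the symlink will point to are actually present:

  cmake --version    # should report 3.14 or newer
  ls runtime/libtorch/fc_base/wetextprocessing-src/runtime/processor    # should contain wetext_processor.h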

@xingchensong

Thank you for pointing it out. This is partly caused by the integration of the ITN (WeTextProcessing) runtime in PR #2001, where we only changed the CMakeLists for runtime/libtorch but forgot to modify the CMakeLists for runtime/binding/python accordingly. You can make similar modifications in binding/python; once you succeed, you are welcome to submit a PR.
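
For anyone attempting that change before an official fix lands, here is a rough sketch of the kind of addition involved in runtime/binding/python's CMake setup. It is illustrative only: the repository URL, pinned revision, and where the snippet belongs should be copied from whatever runtime/libtorch's CMakeLists actually does, not from here.

  # Illustrative sketch -- mirror the WeTextProcessing integration from runtime/libtorch.
  include(FetchContent)
  FetchContent_Declare(
    wetextprocessing
    GIT_REPOSITORY https://github.com/wenet-e2e/WeTextProcessing.git   # assumed URL; use the one pinned in runtime/libtorch
    GIT_TAG        origin/master                                       # use the revision pinned in runtime/libtorch
  )
  FetchContent_MakeAvailable(wetextprocessing)
  # Makes "processor/wetext_processor.h" resolvable for the post_processor target.
  include_directories(${wetextprocessing_SOURCE_DIR}/runtime)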

@Runtrons

@kuruvachankgeorge That worked! I experienced another error though with python setup.py install:


Undefined symbols for architecture x86_64:
"wetext::Processor::Normalize(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&)", referenced from:
wenet::PostProcessor::Process(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, bool) in libpost_processor.a(post_processor.cc.o)
"wetext::Processor::Processor(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&)", referenced from:
wenet::PostProcessor::InitITNResource(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&) in libpost_processor.a(post_processor.cc.o)
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

make[3]: *** [api/libwenet_api.dylib] Error 1
make[2]: *** [api/CMakeFiles/wenet_api.dir/all] Error 2
make[1]: *** [CMakeFiles/_wenet.dir/rule] Error 2
make: *** [_wenet] Error 2
Traceback (most recent call last):
File "setup.py", line 65, in
........

File "setup.py", line 38, in build_extension
raise Exception(
Exception:
Build wenet failed. Please check the error message.
You can ask for help by creating an issue on GitHub.

The key failure appears to be: clang: error: linker command failed with exit code 1 (use -v to see invocation)

Would you mind helping me with this as well? I could not have done this without you, thank you so much!

@kuruvachankgeorge

Can you check the build instructions for iOS? https://github.com/wenet-e2e/wenet/tree/main/runtime/ios
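
In case it helps narrow things down: the undefined wetext::Processor symbols suggest libpost_processor.a is compiled against the WeTextProcessing headers, but the WeTextProcessing library itself never makes it onto the link line for wenet_api. A hypothetical sketch of the kind of CMake change that would address it (the actual library/target name comes from WeTextProcessing's own build files and is only an assumption here):

  # Hypothetical: link the WeTextProcessing runtime library into the API target.
  target_link_libraries(wenet_api PUBLIC post_processor wetext_processor)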

@20070951

Those steps worked! Thank you for your advice.

@xingchensong

@kuruvachankgeorge @20070951, could you try this fix? #2042
