Batched conversion of cascaded biquads to high-order filter #5
Hi @SuperKogito, just transpose dims 0 and 1 so the sections dimension comes first.
Unfortunately this fails at the conv1d call.

```
bx
Out[19]:
tensor([[[ 1.0000, -1.6978,  0.7266],
         [ 1.0000, -1.8332,  0.8801]],

        [[ 1.0000, -1.6978,  0.7266],
         [ 1.0000, -1.8332,  0.8801]],

        [[ 1.0000, -1.6978,  0.7266],
         [ 1.0000, -1.8332,  0.8801]],

        [[ 1.0000, -1.6978,  0.7266],
         [ 1.0000, -1.8332,  0.8801]]])
```
```python
import torch.nn.functional as F

# coeff_product as defined in models/utils.py (see the link below).
def coeff_product(polynomials):
    n = len(polynomials)
    if n == 1:
        return polynomials

    c1 = coeff_product(polynomials[n // 2 :])
    c2 = coeff_product(polynomials[: n // 2])
    if c1.shape[1] > c2.shape[1]:
        c1, c2 = c2, c1
    weight = c1.unsqueeze(1).flip(2)
    prod = F.conv1d(
        c2.unsqueeze(0),
        weight,
        padding=weight.shape[2] - 1,
        groups=c2.shape[0],
    ).squeeze(0)
    return prod
```
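For reference, a hedged sanity check of what `coeff_product` computes on an unbatched `(num_sections, 3)` input; the `numpy.polymul` comparison is illustrative only and not part of the repo:

```python
import numpy as np
import torch

# Two biquad numerators, shape (num_sections=2, num_coeffs=3).
sections = torch.tensor([[1.0, -1.6978, 0.7266],
                         [1.0, -1.8332, 0.8801]])

# coeff_product convolves the coefficient rows, i.e. multiplies the polynomials.
high_order = coeff_product(sections)  # shape (1, 5)

# The same product via numpy, for comparison only.
reference = np.polymul(sections[0].numpy(), sections[1].numpy())
print(torch.allclose(high_order.squeeze(0),
                     torch.from_numpy(reference).to(high_order.dtype),
                     atol=1e-6))  # True
```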
```
bx.shape
Out[21]: torch.Size([4, 2, 3])

bx.transpose(0, 1).shape
Out[22]: torch.Size([2, 4, 3])

coeff_product(bx.transpose(0, 1))
Traceback (most recent call last):
  File "D:\Users\am\AppData\Local\Temp\ipykernel_23352\2398578349.py", line 1, in <module>
    coeff_product(bx.transpose(0, 1))
  File "D:\Users\am\AppData\Local\Temp\ipykernel_23352\657916472.py", line 11, in coeff_product
    prod = F.conv1d(
RuntimeError: Expected 2D (unbatched) or 3D (batched) input to conv1d, but got input of size: [1, 1, 4, 3]
```

When I drop the `unsqueeze(0)` and call

```python
prod = F.conv1d(
    c2,
    weight,
    padding=weight.shape[2] - 1,
    groups=c2.shape[0],
).squeeze(0)
```

the code executes, but the result has neither the right shape nor the correct values.
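Aside: with a 3D input, the base case of the recursion returns a 3D slice, so `c2.unsqueeze(0)` becomes 4D, which is the `[1, 1, 4, 3]` in the traceback above; the later flip and padding along dim 2 also assume a 2D `(num_polys, num_coeffs)` layout, so removing the unsqueeze alone does not fix the 3D case. A minimal per-batch sketch (a workaround of my own, not the transpose approach suggested above):

```python
import torch

def coeff_product_batched(bx: torch.Tensor) -> torch.Tensor:
    # bx: (num_batches, num_sections, num_coeffs)
    # returns: (num_batches, (num_coeffs - 1) * num_sections + 1)
    # Applies the 2D coeff_product defined above to each batch element.
    return torch.stack([coeff_product(b).squeeze(0) for b in bx])

coeff_product_batched(bx).shape  # torch.Size([4, 5]) for the bx above
```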
It works on my side.
Which PyTorch and torchaudio versions are you using?
Is there some documentation of this function / the math behind it? https://github.com/yoyololicon/golf/blob/52f50e7341f769d49e6bddbbe887c149c2b9a413/models/utils.py#L444-L460
I am trying to extend it to work on a batched input of shape (num_batches, num_sections, num_biquad_coeffs=6), and I am not sure how to proceed.
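A hedged sketch of one way the batched `(num_batches, num_sections, 6)` case could be handled, reusing the per-batch idea above and assuming the last dimension packs the coefficients scipy-style as `(b0, b1, b2, a0, a1, a2)`; the helper name and the packing order are assumptions, not the repo's API:

```python
import torch

def sos_to_high_order(sos: torch.Tensor):
    # sos: (num_batches, num_sections, 6), packed per section as
    # (b0, b1, b2, a0, a1, a2) (scipy-style ordering, assumed here).
    # Returns b and a, each of shape (num_batches, 2 * num_sections + 1).
    b_sections, a_sections = sos[..., :3], sos[..., 3:]
    b = torch.stack([coeff_product(s).squeeze(0) for s in b_sections])
    a = torch.stack([coeff_product(s).squeeze(0) for s in a_sections])
    return b, a
```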