Greetings.
I'm facing a memory leak when using Redis pub/sub to serve data to WebSocket clients.
My app uses gofiber (v2.33.0) to serve HTTP, gofiber websocket (v2.0.21) for the WebSocket connections, and go-redis (v8.11.5) as the client for the Redis pub/sub message broker.
A Python app publishes messages to a Redis pub/sub channel, and the Go app subscribes to that channel in a WebSocket endpoint, endlessly forwarding messages from Redis to the WebSocket clients.
Everything works, except that memory usage has been rising steadily in the shape of a typical memory leak, as shown below:
New messages are published to the Redis pub/sub channel only during a certain window of the day (approximately 9 AM - 3 PM, GMT+6). Throughout the day, new clients connect to the WebSocket endpoint. A WebSocket connection is terminated when the Go app fails to write a message (received from Redis pub/sub) to the client.
Expected outcome: Memory utilization should reset to the base level (similar to when no connections existed) after discarding dead client connections.
Actual outcome: Memory utilization peaks and then drops, but it settles at a higher point than the previous reset level.
The following is a depiction of the WebSocket view function:
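In outline, it subscribes to the channel, ranges over the pub/sub messages, and writes each payload to the client until a write fails. Here is only a minimal sketch of that shape; the Redis address, channel name, and identifiers are illustrative placeholders, not the exact code:

```go
// Sketch of the handler shape described above; names and addresses are
// illustrative assumptions, not the actual code.
package main

import (
	"context"
	"log"

	"github.com/go-redis/redis/v8"
	"github.com/gofiber/fiber/v2"
	"github.com/gofiber/websocket/v2"
)

var rdb = redis.NewClient(&redis.Options{Addr: "localhost:6379"})

func main() {
	app := fiber.New()

	app.Get("/ws", websocket.New(func(c *websocket.Conn) {
		ctx := context.Background()

		// Each websocket client gets its own subscription to the channel.
		sub := rdb.Subscribe(ctx, "updates")
		// Closing the subscription releases its dedicated Redis connection
		// and stops the goroutine feeding sub.Channel().
		defer sub.Close()

		// Forward every published message; drop the client on the first
		// failed write, which in turn triggers the deferred Close.
		for msg := range sub.Channel() {
			if err := c.WriteMessage(websocket.TextMessage, []byte(msg.Payload)); err != nil {
				return
			}
		}
	}))

	log.Fatal(app.Listen(":3000"))
}
```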
I am wrapping go-redis in the following custom object:
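Roughly speaking, the wrapper holds the *redis.Client and exposes a Subscribe method. The snippet below is only a sketch of that shape, with placeholder package, type, and method names rather than the real ones:

```go
// A guess at the wrapper's general shape; the names here are assumptions,
// not taken from the real code.
package broker

import (
	"context"

	"github.com/go-redis/redis/v8"
)

type PubSubClient struct {
	rdb *redis.Client
}

func New(addr string) *PubSubClient {
	return &PubSubClient{rdb: redis.NewClient(&redis.Options{Addr: addr})}
}

// Subscribe hands back the raw *redis.PubSub. The caller owns it and must
// call Close() when the websocket client goes away; an unclosed PubSub keeps
// its dedicated connection and internal buffers alive.
func (c *PubSubClient) Subscribe(ctx context.Context, channel string) *redis.PubSub {
	return c.rdb.Subscribe(ctx, channel)
}
```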
I tried to inspect the memory with pprof and found go-redis allocating and retaining the most memory:
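For context, the heap profile is exposed alongside the fiber app roughly like this (a sketch only; the side port and setup shown here are assumptions, not the exact configuration):

```go
// One common way to serve pprof endpoints next to a fiber app
// (a sketch; port 6060 and this layout are assumptions).
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // side effect: registers /debug/pprof/* on http.DefaultServeMux
)

func main() {
	// Run the profiling endpoints on a private side port, separate from
	// the public fiber listener.
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	// ... fiber app setup and app.Listen(":3000") would go here ...
	select {} // placeholder so this sketch runs on its own
}
```

The heap profile can then be pulled with `go tool pprof -inuse_space http://localhost:6060/debug/pprof/heap` and inspected with `top`.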
I have considered freeing memory manually with `debug.FreeOSMemory()`, but solutions online strongly advise against it. Is this growing memory footprint a feature of go-redis, or am I doing something horribly wrong? I'm very confused.
Any help or suggestions are highly welcome.