Issues with Mullvad VPN

blinq

New member
Matchmaking doesn't work with Mullvad VPN enabled.

Arch Linux with Kernel: Linux 6.10.5-arch1-1
Wayland on GDM. RTX 4060 mobile and Ryzen 9 7940HS.
 

Attachments

  • Screenshot from 2024-08-16 18-36-28.png (207.4 KB)
What's the configuration for allowed IPs, and is Shadowsocks on?
I doubt that Shadowsocks is on, since I'm using Mullvad through Tailscale; everything should be allowed. Let me check again with only Tailscale enabled, without the Mullvad exit node.
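For reference, this is how I'm checking and clearing the exit node from the CLI (assuming a reasonably recent Tailscale client; depending on how the operator is set up you may need sudo):

# show current status, including which exit node (if any) is active
tailscale status
# stop routing through the Mullvad exit node
sudo tailscale set --exit-node=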

Edit:

I've just tested it and can confirm that Tailscale alone does not affect this at all; the culprit is the Mullvad exit node. I also copied the output from the console:

"
[Citadel Play Controller] Network location: hel=151+15,sto=/158+15,sto2=/158+15,ams=/178+15,par=/184+15,fra=/185+15,lhr=/186+15,mad=/200+15,waw=/201+15,vie=/211+15,iad=/268+15,ord=/270+15
[GCClient] Send msg 9010 (k_EMsgClientToGCStartMatchmaking), 80 bytes
**** Unable to localize '#Citadel_Dashboard_Matchmaking_SearchingForBotMatch' on panel 'HudAndDBOverlay'
[GCClient] Recv msg 9011 (k_EMsgClientToGCStartMatchmakingResponse), 19 bytes
**** Unable to localize '#GenericConfirmText_Label' on panel descendant of 'MMError'
**** Unable to localize '#Citadel_StartMatchmaking_NoRegionPings' on panel 'MessageLabel'
Job CCitadel_PlayController::InternalFindMatch has spent >3.000ms without yielding: 4.247ms
[WorldRenderer] CWorldRendererMgr::ServiceWorldRequests long frame: 850.792522ms
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 10 allocations
Warning: have oustanding per-frame memory stack with 1 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 24 allocations
Warning: have 6 in use and 0 free per-frame memory stacks outstanding. The oldest stack was created at present 1, and we're now on present 2283
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 10 allocations
Warning: have oustanding per-frame memory stack with 1 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 24 allocations
Warning: have 6 in use and 0 free per-frame memory stacks outstanding. The oldest stack was created at present 1, and we're now on present 2284
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 10 allocations
Warning: have oustanding per-frame memory stack with 1 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 24 allocations
Warning: have 6 in use and 0 free per-frame memory stacks outstanding. The oldest stack was created at present 1, and we're now on present 2285
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 10 allocations
Warning: have oustanding per-frame memory stack with 1 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 24 allocations
Warning: have 6 in use and 0 free per-frame memory stacks outstanding. The oldest stack was created at present 1, and we're now on present 2286
Warning: have oustanding per-frame memory stack with 14 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 10 allocations
Warning: have oustanding per-frame memory stack with 1 allocations
Warning: have oustanding per-frame memory stack with 16 allocations
Warning: have oustanding per-frame memory stack with 24 allocations
Warning: have 6 in use and 0 free per-frame memory stacks outstanding. The oldest stack was created at present 1, and we're now on present 2287"
 
Does matchmaking work with the VPN disabled? Increasing the MTU won't fix this if the packets are already being fragmented inside the VPN tunnel.

You will need to test your connection using ping, for example "ping -c 4 -M do -s 1472 google.com" - this pings google.com four times with 1472 bytes of payload and fragmentation prohibited. Use this method to find the largest packet that can be sent without fragmentation, then add 28 bytes to it for the IP/ICMP headers. So if 1472 is the highest you can go, you would set your MTU to 1500.
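If you'd rather script that sweep than test sizes by hand, something like this works (the sizes and host here are just examples, and this assumes the iputils version of ping):

for size in 1472 1452 1420 1392 1280; do
    # -M do prohibits fragmentation, -s sets the ICMP payload size
    if ping -c 1 -W 2 -M do -s "$size" google.com > /dev/null 2>&1; then
        echo "largest unfragmented payload: $size bytes -> MTU $((size + 28))"
        break
    fi
done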
 
MM works without the VPN
 
[Citadel Play Controller] Network location: hel=151+15,sto=/158+15,sto2=/158+15,ams=/178+15,par=/184+15,fra=/185+15,lhr=/186+15,mad=/200+15,waw=/201+15,vie=/211+15,iad=/268+15,ord=/270+15
Those numbers show that you have bad response times to all of the EU servers
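A quick sanity check is to ping a stable public host with the exit node on and then off, and compare the averages (1.1.1.1 is just a convenient anycast address; any reliable host works):

ping -c 10 1.1.1.1   # with the Mullvad exit node active
# disable the exit node, then run the same ping again and compare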
 
It shouldn't be affected at all, since I'm using an exit node in my home country. Nothing else is affected by the VPN; this is the first time I'm facing such an issue.

I will test your MTU fix shortly, alongside charlixcxstan123's ping test.
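If the MTU does turn out to be the issue, I'll probably try lowering it on the Tailscale interface directly (assuming the default interface name tailscale0; this likely resets when tailscaled restarts):

sudo ip link set dev tailscale0 mtu 1280
ip link show tailscale0   # verify the new MTU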
 