Connecting to a node takes 2 minutes

Hi, team.

When we SSH via Teleport to start a session, it takes 2 minutes on average. After that, starting another session takes less than 1 second.
This happens when we haven't connected to the node server for tens of minutes.
This delay is unacceptable for our development workflow.

Our Teleport cluster runs in HA mode on AWS. We have 3 proxies and 3 auth servers, and the node servers connect to the proxy servers via reverse tunnel.
Do you have any ideas on how to solve this issue?

Our system versions and logs are below.


Versions

  • Version: Teleport Enterprise 4.3.4
  • OS: Amazon Linux 2 (4.14.186-146.268.amzn2.x86_64)

Logs

tsh ssh

$ date +"%Y/%m/%d %p %I:%M:%S" && tsh -d ssh teleport-node
2020/09/14 PM 06:06:40
INFO [CLIENT]    no host login given. defaulting to some.user client/api.go:811
INFO [CLIENT]    [KEY AGENT] Connected to the system agent: "/private/tmp/com.apple.launchd.G5DXtlwU5I/Listeners" client/api.go:2201
DEBU [KEYSTORE]  Returning SSH certificate "/Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com-cert.pub" valid until "2020-09-15 01:37:56 +0900 JST", TLS certificate "/Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com-x509.pem" valid until "2020-09-14 16:37:56 +0000 UTC". client/keystore.go:277
INFO [KEYAGENT]  Loading key for "some.user@example.com" client/keyagent.go:113
INFO [CLIENT]    Connecting proxy=proxy.bastion.example.com:3023 login='some.user' method=0 client/api.go:1633
DEBU [KEYAGENT]  Validated host proxy.bastion.example.com:3023. client/keyagent.go:285
INFO [CLIENT]    Successful auth with proxy proxy.bastion.example.com:3023 client/api.go:1623
DEBU [CLIENT]    Found clusters: [{"name":"teleport","lastconnected":"2020-09-14T08:53:58.31843409Z","status":"online"}] client/client.go:107
INFO [CLIENT]    Client= connecting to node=e-learning on cluster teleport client/client.go:539
DEBU [KEYAGENT]  Validated host e-learning:0@default@teleport. client/keyagent.go:285

       __|  __|_  )
       _|  (     /   Amazon Linux 2 AMI
      ___|\___|___|

https://aws.amazon.com/amazon-linux-2/
[some.user@ip-10-0-0-2 ~]$

Node logs (/var/log/messages)

Sep 14 06:08:49 ip-10-0-0-2  /usr/bin/teleport[3171]: INFO [NODE]      Creating (interactive) session 9a062832-767a-4bbd-b015-d29308b0ecdd. id:3095 local:10.0.0.2:57332 login:some.user remote:10.0.1.10:59606 teleportUser:some.user@example.com srv/sess.go:194
Sep 14 06:08:49 ip-10-0-0-2  /usr/bin/teleport[3171]: INFO [SESSION:N] New party ServerContext(10.0.1.10:59606->10.0.0.2:57332, user=some.user, id=3095) party(id=df528654-a105-4c5d-a3ec-d097cbf2b653) joined session: 9a062832-767a-4bbd-b015-d29308b0ecdd srv/sess.go:1080

It takes 129 seconds (from 2020/09/14 PM 06:06:40 in the tsh ssh log to Sep 14 06:08:49 in the node logs).
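For reference, that gap can be recomputed from the two clock readings (GNU date syntax; the timezone labels in the two logs differ, so this just compares the raw times):

```shell
# Compute the delay between the tsh start time and the node's
# session-creation log entry, treating both as raw clock readings.
start=$(date -u -d "2020-09-14 06:06:40" +%s)
end=$(date -u -d "2020-09-14 06:08:49" +%s)
echo "$((end - start)) seconds"   # 129 seconds
```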

ssh (not captured at the same time as above)

$ ssh -v teleport-node
OpenSSH_8.1p1, LibreSSL 2.7.3
debug1: Reading configuration data /Users/some.user/.ssh/config
debug1: /Users/some.user/.ssh/config line 35: Applying options for teleport-node
debug1: /Users/some.user/.ssh/config line 42: Applying options for *
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 47: Applying options for *
debug1: Setting implicit ProxyCommand from ProxyJump: ssh -v -W '[%h]:%p' teleport-proxy
debug1: Executing proxy command: exec ssh -v -W '[teleport-node]:3022' teleport-proxy
debug1: identity file /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com type 0
debug1: identity file /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com-cert type 4
debug1: Local version string SSH-2.0-OpenSSH_8.1
OpenSSH_8.1p1, LibreSSL 2.7.3
debug1: Reading configuration data /Users/some.user/.ssh/config
debug1: /Users/some.user/.ssh/config line 21: Applying options for teleport-proxy
debug1: /Users/some.user/.ssh/config line 42: Applying options for *
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 47: Applying options for *
debug1: Connecting to proxy.bastion.example.com port 3023.
debug1: Connection established.
debug1: identity file /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com type 0
debug1: identity file /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com-cert type 4
debug1: Local version string SSH-2.0-OpenSSH_8.1
debug1: Remote protocol version 2.0, remote software version Teleport
debug1: no match: Teleport
debug1: Authenticating to proxy.bastion.example.com:3023 as 'some.user'
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: algorithm: curve25519-sha256@libssh.org
debug1: kex: host key algorithm: ssh-rsa-cert-v01@openssh.com
debug1: kex: server->client cipher: chacha20-poly1305@openssh.com MAC: <implicit> compression: none
debug1: kex: client->server cipher: chacha20-poly1305@openssh.com MAC: <implicit> compression: none
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host certificate: ssh-rsa-cert-v01@openssh.com SHA256:R9kbZg3v2+b9bnA3HJYY2i8eG3qr92yNBMbeDPOJw6k, serial 0 ID "" CA ssh-rsa SHA256:NsRRCIEfGVIInXd8QCi9EN2njUjD4gdnZPzk9BiaRkY valid after 2020-08-21T11:27:50
debug1: checking without port identifier
debug1: Host 'proxy.bastion.example.com' is known and matches the RSA-CERT host certificate.
debug1: Found CA key in /Users/some.user/.ssh/known_hosts:1
debug1: found matching key w/out port
debug1: rekey out after 134217728 blocks
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: SSH2_MSG_NEWKEYS received
debug1: rekey in after 134217728 blocks
debug1: Will attempt key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Will attempt key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey
debug1: Next authentication method: publickey
debug1: Offering public key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Server accepts key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Authentication succeeded (publickey).
Authenticated to proxy.bastion.example.com ([10.0.1.10]:3023).
debug1: channel_connect_stdio_fwd teleport-node:3022
debug1: channel 0: new [stdio-forward]
debug1: getpeername failed: Bad file descriptor
debug1: Entering interactive session.
debug1: pledge: network
debug1: Remote protocol version 2.0, remote software version Teleport
debug1: no match: Teleport
debug1: Authenticating to teleport-node:3022 as 'some.user'
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: algorithm: curve25519-sha256@libssh.org
debug1: kex: host key algorithm: ssh-rsa-cert-v01@openssh.com
debug1: kex: server->client cipher: chacha20-poly1305@openssh.com MAC: <implicit> compression: none
debug1: kex: client->server cipher: chacha20-poly1305@openssh.com MAC: <implicit> compression: none
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host certificate: ssh-rsa-cert-v01@openssh.com SHA256:FuC24ByOn3u5cBl+pvtgljGMyU2MYa33p91xC0tBYDk, serial 0 ID "" CA ssh-rsa SHA256:NsRRCIEfGVIInXd8QCi9EN2njUjD4gdnZPzk9BiaRkY valid after 2020-08-21T15:57:56
debug1: checking without port identifier
debug1: No matching CA found. Retry with plain key
debug1: No matching CA found. Retry with plain key
debug1: Host '[teleport-node]:3022' is known and matches the RSA host key.
debug1: Found key in /Users/some.user/.ssh/known_hosts:9
debug1: rekey out after 134217728 blocks
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: SSH2_MSG_NEWKEYS received
debug1: rekey in after 134217728 blocks
debug1: Will attempt key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Will attempt key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey
debug1: Next authentication method: publickey
debug1: Offering public key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Server accepts key: /Users/some.user/.tsh/keys/bastion.example.com/some.user@example.com RSA-CERT SHA256:pN2yuPhndC0xsDCemrvNeucT0QpNKac4Lpi4MZmQ4hU explicit agent
debug1: Authentication succeeded (publickey).
Authenticated to teleport-node (via proxy).
debug1: channel 0: new [client-session]
debug1: Entering interactive session.
debug1: pledge: proc
debug1: Requesting authentication agent forwarding.
debug1: Sending environment.
debug1: Sending env LC_TERMINAL_VERSION = 3.3.11
debug1: Sending env LANG = ja_JP.UTF-8
debug1: Sending env LC_TERMINAL = iTerm2
debug1: client_input_global_request: rtype keepalive@openssh.com want_reply 1
debug1: client_input_global_request: rtype keepalive@openssh.com want_reply 1
debug1: client_input_global_request: rtype keepalive@openssh.com want_reply 1
debug1: client_input_global_request: rtype keepalive@openssh.com want_reply 1

       __|  __|_  )
       _|  (     /   Amazon Linux 2 AMI
      ___|\___|___|

https://aws.amazon.com/amazon-linux-2/
[some.user@ip-10-0-0-2 ~]$

Best regards.

So I think what you’re saying is that connecting via ssh is much quicker than connecting via tsh? Or is it just that after the first connection is established (by any method), subsequent connections are quicker?

I can certainly see how a delay would be frustrating. Unfortunately, the lack of timestamps in the tsh ssh log and the low level of logging on the node mean that there's not a lot of useful information for diagnosing the problem in what you've provided so far.

Can you get logs from the Teleport proxy (proxy.bastion.example.com) for the time period in question? If they’re also only at INFO level, I’d ask if you could please turn the log level on the proxy up to DEBUG by modifying your Teleport config file (/etc/teleport.yaml by default) like this and restarting Teleport:

teleport:
  log:
    output: syslog
    severity: DEBUG
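After editing the file, the service needs to be restarted for the new severity to take effect; on a systemd host that is typically:

```shell
# Restart Teleport so the new log severity is picked up.
# The unit name may differ in your deployment.
sudo systemctl restart teleport
```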

If we can get DEBUG level logs from the proxy server from around the time that this issue is occurring, we should be able to assist further.

One other thing I will say is that I notice you're deployed in HA mode on AWS. Are you using DynamoDB for your backend storage? Another customer had a similar issue with slow tsh login/tsh ssh commands; it turned out their DynamoDB read/write limits were set too low, causing throttling/backoff whenever Teleport read from or wrote to the tables. That's another avenue worth investigating, given that the issue only occurs when you haven't logged in for a while (I would expect a flurry of activity at first login, with less activity for subsequent logins).
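If you want to check what the backend table currently has provisioned, something like this can help; the table name here is a placeholder:

```shell
# Inspect provisioned throughput on the Teleport backend table.
# "teleport-backend" is a placeholder; substitute your actual table name.
aws dynamodb describe-table --table-name teleport-backend \
  --query 'Table.ProvisionedThroughput'
```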


Thank you for your reply.

So I think what you’re saying is that connecting via ssh is much quicker than connecting via tsh ? Or is it just that after the first connection is established (by any method), subsequent connections are quicker?

The latter is correct. I apologize for any confusion.

If we can get DEBUG level logs from the proxy server from around the time that this issue is occurring, we should be able to assist further.

The log level in our configuration was INFO. Unfortunately, although I changed the severity value to DEBUG as you suggested and restarted Teleport with $ sudo systemctl reload teleport-node, nothing changed.

the issue turned out to be that their DynamoDB read/write limits were set too low and it was causing throttling/backoff when they tried to read/write data from the tables.

I believe this was one of the causes of the issue; it improved after we increased the read/write limits.
The RCU/WCU on the DynamoDB table were both set to 5, which seemed too low, so I changed the capacity mode from Provisioned to On-Demand to raise the read/write limits. After that, the average connection latency dropped from 130 seconds to 36 seconds. But this isn't a fundamental solution.

Interestingly, I found that while a connection opened with tsh ssh or ssh is stalled, a new connection opened with the ssh command is established immediately.
Additionally, I turned on CloudWatch Contributor Insights to see how accesses are distributed across the items in the DynamoDB table. The results show that the items for certificate authorities are accessed far more often than the others.

I hope this information helps.

Do you have any further ideas about this issue?

You should see an increased level of logging wherever your Teleport logs are being output - /var/log/messages in your case. The logs at this level from the Teleport auth/proxy server (starting when you try to connect to a node, and stopping when the connection has successfully started) could potentially help us diagnose an issue.

We certainly recommend the use of provisioned units. What are the read/write limits set to on your cluster? I can see from your CloudWatch logs that there is a sustained throughput of ~25 units, so the limits would need to be set high enough to allow that.

This is mostly expected, as these keys contain Teleport’s host and user certificate authorities, which are used for the majority of Teleport operations.

I found that the debug logs are not in /var/log/messages but in journald.
Below are the logs from the node when it took 130 seconds to establish an SSH connection.
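For reference, I pulled them with a command along these lines (the unit name and time window are specific to our setup):

```shell
# Pull Teleport debug logs from journald for the window in question.
# The unit name here matches our systemd service; adjust as needed.
journalctl -u teleport-node \
  --since "2020-09-25 07:57:00" --until "2020-09-25 08:00:00" --no-pager
```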

Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Transport request: teleport-transport. leaseID:1 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:434
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Received out-of-band proxy transport request for @local-node []. leaseID:1 target:proxy.bastion.example.com:3024 reversetunnel/transport.go:209
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      conn(3.114.9.122:36440->192.168.0.23:34872, user=root) auth attempt fingerprint:ssh-rsa-cert-v01@openssh.com SHA256:Bm1PHxbWkJfoobarJOkPbQVxzHXxMMV2Ox+Dno5juGg local:192.168.0.23:34872 remote:3.114.9.122:36440 user:root srv/authhandlers.go:151
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      conn(3.114.9.122:36440->192.168.0.23:34872, user=root) auth attempt with key ssh-rsa-cert-v01@openssh.com SHA256:Bm1PHxbWkJfoobarJOkPbQVxzHXxMMV2Ox+Dno5juGg, &ssh.Certificate{Nonce:[]uint8{0x1f, 0xa4, 0x51, 0x32, 0x6d, 0x79, 0x10, 0x8, 0xd6, 0xde, 0xea, 0x77, 0x8e, 0xd5, 0x58, 0x79, 0x68, 0x4d, 0x88, 0x14, 0xb9, 0xda, 0xd7, 0x9b, 0xe4, 0xda, 0x72, 0x2e, 0xa, 0x6a, 0xaf, 0x6c}, Key:(*ssh.rsaPublicKey)(0xc000ab1c40), Serial:0x0, CertType:0x1, KeyId:"ssh-healthcheck", ValidPrincipals:[]string{"root"}, ValidAfter:0x5efd7fad, ValidBefore:0x71c982e9, Permissions:ssh.Permissions{CriticalOptions:map[string]string{}, Extensions:map[string]string{"permit-agent-forwarding":"", "permit-port-forwarding":"", "permit-pty":"", "teleport-roles":"{\"version\":\"v1\",\"roles\":[\"admin\"]}", "teleport-traits":"{\"kubernetes_groups\":null,\"logins\":null}"}}, Reserved:[]uint8{}, SignatureKey:(*ssh.rsaPublicKey)(0xc000ab1c80), Signature:(*ssh.Signature)(0xc000a97380)} fingerprint:ssh-rsa-cert-v01@openssh.com SHA256:Bm1PHxbWkJfoobarJOkPbQVxzHXxMMV2Ox+Dno5juGg local:192.168.0.23:34872 remote:3.114.9.122:36440 user:root srv/authhandlers.go:151
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      Successfully authenticated fingerprint:ssh-rsa-cert-v01@openssh.com SHA256:Bm1PHxbWkJfoobarJOkPbQVxzHXxMMV2Ox+Dno5juGg local:192.168.0.23:34872 remote:3.114.9.122:36440 user:root srv/authhandlers.go:196
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      Checking permissions for (ssh-healthcheck,root) to login to node with RBAC checks. fingerprint:ssh-rsa-cert-v01@openssh.com SHA256:Bm1PHxbWkJfoobarJOkPbQVxzHXxMMV2Ox+Dno5juGg local:192.168.0.23:34872 remote:3.114.9.122:36440 user:root srv/authhandlers.go:323
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [SSH:NODE]  Incoming connection 3.114.9.122:36440 -> 192.168.0.23:34872 vesion: SSH-2.0-paramiko_2.7.1. sshutils/server.go:430
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [KEEPALIVE] Starting keep-alive loop with with interval 1m0s and max count 3. srv/keepalive.go:67
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      Handling request exec, want reply true. id:1185 local:192.168.0.23:34872 login:root remote:3.114.9.122:36440 teleportUser:ssh-healthcheck regular/sshserver.go:1140
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [HTTP:PROX] No valid environment variables found. proxy/proxy.go:222
Sep 25 07:57:30 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [HTTP:PROX] No proxy set in environment, returning direct dialer. proxy/proxy.go:137
Sep 25 07:57:31 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:57:36 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:57:41 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:57:46 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:57:51 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:57:56 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:01 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 52.192.203.55:3024. leaseID:1 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:58:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 13.114.3.1:3024. leaseID:2 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:58:07 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 18.177.2.20:3024. leaseID:3 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:58:11 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:16 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:21 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:26 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:31 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: INFO [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:367
Sep 25 07:58:36 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:41 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:46 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:51 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:58:56 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:01 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 52.192.203.55:3024. leaseID:1 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:59:06 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 13.114.3.1:3024. leaseID:2 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:59:07 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Ping -> 18.177.2.20:3024. leaseID:3 target:proxy.bastion.example.com:3024 reversetunnel/agent.go:415
Sep 25 07:59:11 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:16 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:21 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:26 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:31 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:36 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:40 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: INFO [NODE]      Creating (exec) session a3e0cb9c-63a8-4ecc-8843-b79523725704. id:1185 local:192.168.0.23:34872 login:root remote:3.114.9.122:36440 teleportUser:ssh-healthcheck srv/sess.go:217
Sep 25 07:59:40 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: INFO [NODE]      Started local command execution: "logger \"$(date) | SSH Healthcheck from lambda\"" id:1185 local:192.168.0.23:34872 login:root remote:3.114.9.122:36440 teleportUser:ssh-healthcheck srv/exec.go:183
Sep 25 07:59:40 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: INFO [SESSION:N] Closing session a3e0cb9c-63a8-4ecc-8843-b79523725704 srv/sess.go:573
Sep 25 07:59:40 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [SSH:NODE]  Closed connection 3.114.9.122:36440. sshutils/server.go:432
Sep 25 07:59:40 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      Client 3.114.9.122:36440 disconnected. id:1185 local:192.168.0.23:34872 login:root remote:3.114.9.122:36440 teleportUser:ssh-healthcheck regular/sshserver.go:1104
Sep 25 07:59:41 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [NODE]      Local command successfully executed. id:1185 local:192.168.0.23:34872 login:root remote:3.114.9.122:36440 teleportUser:ssh-healthcheck srv/exec.go:197
Sep 25 07:59:41 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:46 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [PROXY:AGE] Outbound tunnel stats. stats:map[connected:3 connecting:0 disconnected:0] target:teleport cluster:teleport reversetunnel/agentpool.go:387
Sep 25 07:59:47 ip-192-168-0-23.ap-northeast-1.compute.internal /usr/bin/teleport[14090]: DEBU [AUDIT]     Session upload completed. duration:125.567146ms session-id:a3e0cb9c-63a8-4ecc-8843-b79523725704 events/uploader.go:257

I changed our R/W capacity mode back to Provisioned and increased the number of capacity units from 100 to 1000.
However, the connection latency went back to 130 seconds, as shown in the logs above.

Do you have any further ideas based on the logs above?

Thanks.