Whenever you search for something in Google, you only see what ranks. You don't see the hundreds or thousands of pages that were filtered out for pushing too hard.
If you are unsure what density levels are reasonable, consider patterning your approach after what is ranking right now.
Keep in mind that some highly trusted brands rank more on their brand strength than on their on-page content. If you are creating content for a newer & less-trusted website, you would likely be better off putting more weight on results from smaller & lesser-known websites which still managed to rank well in Google.
There is no single optimal or universal keyword density percentage. Each search query is unique & search engines compare (or normalize) documents against other top documents to determine some of their specific thresholds. Some keywords like "credit cards" naturally appear as a two word phrase, whereas other terms may be more spread out. Further, some highly trusted websites with great awareness, strong usage data & robust link profiles can likely get away with more repetition than smaller, less trusted sites can.
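To make the "density" idea concrete, here is a minimal sketch of how a phrase's keyword density is typically calculated: occurrences of the phrase as a word sequence, scaled by phrase length, divided by the document's total word count. The function name and the tokenizing regex are illustrative assumptions, not any particular tool's method.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the document's words accounted for by the phrase.

    Counts the phrase as a contiguous word sequence (e.g. "credit cards"),
    so a two-word phrase appearing 3 times in 18 words scores 33.3%.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count (overlapping) occurrences of the phrase as a word sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100 * hits * n / len(words)

text = ("Compare credit cards before applying. The best credit cards "
        "offer rewards, and some credit cards waive annual fees.")
print(round(keyword_density(text, "credit cards"), 1))  # → 33.3
```

A figure like 33.3% would be far above what ranks for most queries, which is exactly the point of comparing your page against the documents already ranking rather than chasing a fixed percentage.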
As a general rule of thumb, when it comes to keyword frequency...
Lazy & uninformed cheap outsourced writing tends to be fairly repetitive, in part because people paid by the word to churn out cheap content have an incentive to bloat the word count, no incentive to trim the fat, and no incentive to do deep research. Google's leaked remote rater guidelines tell raters to rate low-information, repetitive content poorly.
In this day and age the primary use of these types of analysis tools is not to keep dialing up the keyword density, but rather to lower the focus on the core terms while including alternate word forms, acronyms, synonyms & other supporting vocabulary.
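One way to audit that balance is to tally the core term against its supporting vocabulary, so you can see whether a page leans too heavily on one exact phrase. This is a hypothetical sketch: the variant list and function name are made up for illustration, and a real tool would also handle multi-word variants and stemming.

```python
import re
from collections import Counter

def vocabulary_mix(text: str, core: str, variants: list[str]) -> dict:
    """Tally a core term against alternate forms & synonyms.

    Returns counts for the core term, each variant, and the total
    word count, so the core-vs-supporting ratio is easy to eyeball.
    """
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    counts = Counter(tokens)
    return {
        "core": counts[core.lower()],
        "variants": {v: counts[v.lower()] for v in variants},
        "total_words": len(tokens),
    }

text = ("Our SEO guide covers search optimization basics: optimizing "
        "titles, links and content so search engines can rank the page.")
mix = vocabulary_mix(text, "seo", ["optimization", "optimizing", "optimize"])
print(mix)
```

If the core count dwarfs the variant counts, that is a hint to diversify the wording rather than repeat the exact phrase again.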