“Now it is possible to index your dynamic content through Google”
The HTTP standard defines a number of request methods, of which GET is the most common. Browsers, for instance, use GET to retrieve a URL whenever you type it into the address bar or click a link. The POST method differs from GET in that it carries a payload, a request body meant to be processed by an application on the server. HTML forms also use POST to send the text from their input fields to the server for processing.
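As a rough sketch of what that looks like on the wire, here is how a browser encodes a form submission as a POST body; the field names (`q`, `lang`) and the `/search` path are made-up examples, not anything from a real site:

```javascript
// Sketch: the body a browser would send for a simple search form,
// encoded as application/x-www-form-urlencoded (spaces become "+").
// Field names and values are hypothetical.
const body = new URLSearchParams({ q: "site search term", lang: "en" }).toString();

console.log("POST /search HTTP/1.1");
console.log("Content-Type: application/x-www-form-urlencoded");
console.log("");
console.log(body); // q=site+search+term&lang=en
```

Unlike a GET, nothing about this request is visible in the URL, which is exactly why a crawler that only follows links never sees the resulting content.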
After some early experimentation, search engines have generally avoided filling in input fields and making POST requests on their own. So if a site's rich database content is reachable only through a site-search field, with no discoverable links to the result pages, that content cannot be indexed, even by today's Googlebot.
Some POST Requests Now Work with Google
Google’s new Evergreen Googlebot can crawl and even index content fetched via XHR POST requests. The question of whether it actually does was raised by technical SEO Valentin Pletzer, who follows the Evergreen Googlebot quite closely. Keep in mind that many other crawlers still lack this capability.
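To make concrete what kind of request is at stake, here is a minimal sketch of a page loading its content via an XHR-style POST at render time; the `/api/content` endpoint and the payload shape are hypothetical, not a real API:

```javascript
// Sketch: building the XHR POST a page template might issue to fetch
// its content while rendering. Endpoint and payload are hypothetical.
function buildContentRequest(pageId) {
  return {
    url: "/api/content", // hypothetical endpoint
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ page: pageId }),
    },
  };
}

// A browser, or the Evergreen Googlebot's renderer, would then run
// something like: fetch(req.url, req.options)
const req = buildContentRequest("home");
console.log(req.options.method); // POST
```

Because this request only exists once JavaScript runs, a crawler that does not render the page never triggers it and never sees the content it returns.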
What you need to know
When Google renders dynamic content driven by XHR POST requests, each sub-request counts against your overall crawl budget. Because the responses to those POST requests are not cached, every XHR request needed to assemble a page consumes extra budget. If you had a crawl budget of 100 fetches and your page template made one XHR POST request for on-the-fly content, only 50 of your pages would end up cached for Google's search index.
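The arithmetic behind that example can be sketched as follows; the function name and the idea of a fixed per-page request count are simplifying assumptions for illustration:

```javascript
// Sketch: uncached XHR POST sub-requests eat into crawl budget.
// Each page costs one fetch for the page itself plus one fetch per
// XHR POST it triggers. Names and numbers are illustrative.
function pagesCrawlable(crawlBudget, xhrPostsPerPage) {
  const fetchesPerPage = 1 + xhrPostsPerPage; // page + its POST sub-requests
  return Math.floor(crawlBudget / fetchesPerPage);
}

console.log(pagesCrawlable(100, 1)); // 50: the example from the text
console.log(pagesCrawlable(100, 3)); // 25: more POSTs per page, fewer pages
```

The takeaway is that every additional uncached sub-request per template divides the number of pages Google can cover on a given budget.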