Our project's website is built entirely with AngularJS, and the front end and back end exchange data via AJAX, so the pages a crawler fetches are essentially blank. Moreover, much of the data on a page only appears in response to events, such as clicking to load comments or scrolling the page. Google can execute the JavaScript in a page, but Baidu cannot, and our SEO targets Baidu.
So how should SEO be done for a website like this?
prerender.io can render the page, but the event-triggered data still won't be displayed, so it's not a good solution for us.
One option: build static pages just for search engines, detect the crawler's User-Agent on each request, and return those pages instead. Would that be considered a black hat SEO technique?
Or is there a better solution?
See Google’s related documentation: Making AJAX Applications Crawlable
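For reference, the scheme that documentation describes maps hash-bang URLs (example.com/#!/foo) to a ?_escaped_fragment_=/foo query parameter that the crawler requests instead, and the server answers with a prerendered HTML snapshot. A minimal sketch in Node/Express, assuming a hypothetical getSnapshot helper (the names here are illustrative, not from the thread):

```typescript
import express from "express";

const app = express();

// Hypothetical helper: returns a prerendered HTML snapshot for a
// client-side route, e.g. from a cache filled by a headless browser.
async function getSnapshot(route: string): Promise<string> {
  return `<html><body><!-- prerendered content for ${route} --></body></html>`;
}

app.get("*", async (req, res, next) => {
  // Crawlers following the scheme rewrite "example.com/#!/foo"
  // into "example.com/?_escaped_fragment_=/foo" before requesting it.
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    res.send(await getSnapshot(fragment)); // serve the static snapshot
  } else {
    next(); // normal browsers get the regular AngularJS app shell
  }
});

app.listen(3000);
```

Because the snapshot is requested through a documented parameter rather than by sniffing the UA, this was Google's sanctioned alternative to cloaking.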
"Build static pages just for search engines, detect the crawler's User-Agent, and return those pages. Is this black hat SEO?"
This is considered black hat: using the UA to decide which page to return means users and search engines can see different content, i.e. cloaking. Plenty of people do it, and presumably some survive, but it carries penalty risk.
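For concreteness, the UA-based approach being asked about would look roughly like this. A sketch, not a recommendation; the crawler list and the renderStaticPage helper are illustrative assumptions:

```typescript
import express from "express";

const app = express();

// Substrings that identify common search-engine crawlers (illustrative list).
const CRAWLER_UAS = ["Baiduspider", "Googlebot", "360Spider", "Sogou"];

// Hypothetical helper: returns a static HTML version of the requested page.
async function renderStaticPage(url: string): Promise<string> {
  return `<html><body><!-- static content for ${url} --></body></html>`;
}

app.use(async (req, res, next) => {
  const ua = req.get("User-Agent") ?? "";
  if (CRAWLER_UAS.some((bot) => ua.includes(bot))) {
    // Search engines get the prebuilt static page...
    res.send(await renderStaticPage(req.originalUrl));
  } else {
    // ...while real users get the AngularJS app. If the two versions
    // differ in content, this is cloaking, hence the black hat label.
    next();
  }
});

app.listen(3000);
```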
Baidu may not be as good as Google at crawling AJAX-loaded data. I saw a case in the past that seemed to handle it this way: the dynamic content is shown to the user on screen as usual, but the same information is also written out as plain text below it, so the search engine can still crawl what the page is meant to express. I think it can serve as a reference.
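If I understand that case correctly, the idea is progressive enhancement: the server embeds the essential content as plain HTML in the initial response, and the script-driven UI sits on top of it. A rough sketch under that assumption (the route and data are made up for illustration):

```typescript
import express from "express";

const app = express();

// Made-up data standing in for whatever the AJAX call would return.
const comments = ["First comment", "Second comment"];

app.get("/article/:id", (req, res) => {
  // The interactive widgets still load via AJAX, but the same
  // information is also present as plain text in the initial HTML,
  // so a crawler that cannot run JS still sees the page's content.
  res.send(`
    <html>
      <body>
        <div id="app"><!-- AngularJS renders the interactive UI here --></div>
        <div class="seo-fallback">
          ${comments.map((c) => `<p>${c}</p>`).join("")}
        </div>
      </body>
    </html>
  `);
});

app.listen(3000);
```

Unlike UA sniffing, everyone receives the same HTML here, so it avoids the cloaking problem discussed above.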