I am working on the third issue. In the translation endpoint of the Recommendation API, we currently fetch the wikidata_id from the respective Wikipedia API and then run a SPARQL query that fetches the sitelink counts and filters out disambiguation pages.
I have pushed a patch that gets all of this data in a single request from the respective Wikipedia API. This removes the dependency on the Wikidata Query Service and makes the API respond faster. It also fixes the 429 errors we started seeing after the previous patch, when WDQS was rate-limiting our requests even though they were split into batches of 50.
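To illustrate the single-request pattern (this is a hedged sketch, not the actual patch): the MediaWiki Action API's `prop=pageprops` returns both `wikibase_item` (the wikidata_id) and a `disambiguation` flag set by the Disambiguator extension, so both can be retrieved in one call per batch and disambiguation pages filtered client-side. The sketch below covers only those two pieces; how the patch obtains sitelink counts is not shown here, and the function names are my own.

```python
"""Sketch: one MediaWiki API request per batch of up to 50 titles,
returning wikidata_ids and dropping disambiguation pages."""
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://{lang}.wikipedia.org/w/api.php"  # per-language endpoint


def build_query(titles):
    # pageprops carries wikibase_item and the "disambiguation" marker;
    # ppprop narrows the response to just those two props.
    return {
        "action": "query",
        "format": "json",
        "prop": "pageprops",
        "ppprop": "wikibase_item|disambiguation",
        "titles": "|".join(titles),  # the API accepts up to 50 titles
    }


def parse_response(payload):
    """Map title -> wikidata_id, skipping disambiguation pages."""
    results = {}
    for page in payload.get("query", {}).get("pages", {}).values():
        props = page.get("pageprops", {})
        if "disambiguation" in props:
            continue  # filter disambiguation pages client-side
        if "wikibase_item" in props:
            results[page["title"]] = props["wikibase_item"]
    return results


def fetch(lang, titles):
    url = API.format(lang=lang) + "?" + urlencode(build_query(titles))
    with urlopen(url) as resp:
        return parse_response(json.load(resp))
```

Since the filtering happens in `parse_response`, no second service is involved and a WDQS outage or rate limit cannot block the endpoint.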