How to Solve the Facebook Link Clicks vs. Google Analytics Sessions Discrepancy

If you've ever run an objective: Traffic campaign in Facebook, you've probably noticed the wide discrepancy between Facebook Link Clicks and your sessions in Google Analytics. Here's a search result for this topic: https://www.google.ae/search?q=facebook+link+clicks+vs+google+analytics+sessions&dcr=0&source=lnms&sa=X&ved=0ahUKEwiO7b6X9_rWAhVIuBoKHS7XDs8Q_AUICSgA&biw=1517&bih=681&dpr=0.9

Clicks and sessions are different metrics by definition: Facebook counts a link click the moment the link is clicked, while Google Analytics only counts a session once the GA tag loads on the landing page. But even after factoring in this difference in definition, plus multiple clicks from one person, users closing the site before the page loads, or JavaScript failing to load on the website, the gap was usually very large, with clicks running 2-3X the number of sessions.
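To make the gap concrete, here's a minimal sketch (with entirely made-up campaign numbers) of how you might quantify the clicks-to-sessions discrepancy once you've pulled Link Clicks from Facebook and sessions from GA:

```python
# Quantify the gap between Facebook Link Clicks and GA sessions.
# The numbers below are hypothetical, for illustration only.

def discrepancy_ratio(link_clicks: int, ga_sessions: int) -> float:
    """Return the clicks-to-sessions ratio, e.g. 2.5 means clicks are 2.5x sessions."""
    if ga_sessions == 0:
        raise ValueError("no sessions recorded")
    return link_clicks / ga_sessions

clicks = 1200    # Facebook "Link Clicks" for the campaign
sessions = 450   # GA sessions attributed to the same campaign (e.g. via UTM tags)

ratio = discrepancy_ratio(clicks, sessions)
lost = 1 - sessions / clicks
print(f"Clicks are {ratio:.2f}x sessions; {lost:.0%} of clicks never became a session")
```

A ratio in the 2-3X range, as in this example, is the symptom the two approaches below try to fix.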

There are two solutions to address this:

A. Run a conversion-based campaign where the conversion event is a PageView. This optimizes ad delivery for the Facebook pixel's PageView event. This cool idea was detailed in a post by Vernon Johnson on 3Q Digital's blog [at least that's where I stumbled upon it]. I really liked this approach because it adds a layer of accountability to the ad: show it to users who are likely to actually view the page. 

https://3qdigital.com/blog/never-run-facebook-traffic-campaign#.WdKf1n6FabY.twitter

B. Run objective: traffic campaigns and optimize ad sets for landing page views. This is a relatively new feature in Facebook ads. 

The default optimization option is Link Clicks [we'll deliver your ads to the right people to help you get the most link clicks from your ad to a destination]. Right under it is Landing Page Views [we'll deliver your ads to people who are more likely to click on your ad's link AND load the landing page]. That second part is crucial: click AND load the page. I think Facebook introduced this bidding option precisely to help address the Facebook link clicks vs. (Google) Analytics discrepancy. I'm currently running a Traffic campaign with this setting, and the variance between the two metrics is very narrow - at least from early results. 

If you haven't tested this feature yet, give it a try and then compare the data from both platforms. Hopefully it will help your campaigns as well. 

Do you think there's a better option to reduce this discrepancy? Let me know via comments.