Is there any way to configure an incremental crawl not to delete previously indexed items when the gatherer does not find them in the current incremental crawl?
The reason is that we have a BDC model with a timestamp field for crawling an external content source, a huge SQL table with millions of records. When we run a normal incremental crawl against this table, it always gets stuck at just over one million records. The table grows by an average of 11,000 records per day. As an alternative, we configured the BDC to filter for today's records only, but the incremental crawl then deletes the records that were indexed yesterday.
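For reference, the timestamp-based part of our model looks roughly like the sketch below. The entity, method, parameter, and field names (Orders, ReadOrdersListInstance, LastModified) are placeholders for our actual model; the relevant piece is the LastModifiedTimeStampField property on the Finder method instance, which tells the gatherer which field to compare on incremental crawls:

```xml
<!-- Sketch only; names are placeholders for our real model -->
<MethodInstances>
  <MethodInstance Type="Finder" Name="ReadOrdersListInstance"
                  ReturnParameterName="returnOrders" Default="true"
                  DefaultDisplayName="Read Orders List">
    <Properties>
      <!-- Field the Search gatherer compares to skip unchanged rows
           during incremental crawls -->
      <Property Name="LastModifiedTimeStampField"
                Type="System.String">LastModified</Property>
    </Properties>
  </MethodInstance>
</MethodInstances>
```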
We have also tried a ChangeLog-based crawl by following the steps in an MSDN blog. Everything seemed fine until we realized that incremental crawls do not populate crawled properties, nor the managed properties mapped to them.
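For completeness, the ChangeLog approach we tried adds enumerator method instances along the lines of the sketch below (names are placeholders, following the blog's pattern). These let the incremental crawl ask the external system which item IDs changed or were deleted since the last crawl; this part appears to work for us, and it is only the property population that fails:

```xml
<!-- Sketch only; names are placeholders for our real model -->
<MethodInstances>
  <!-- Returns IDs of items changed since the last crawl -->
  <MethodInstance Type="ChangedIdEnumerator" Name="GetChangedOrderIdsInstance"
                  ReturnParameterName="changedOrderIds" />
  <!-- Returns IDs of items deleted since the last crawl,
       so the index can remove them explicitly -->
  <MethodInstance Type="DeletedIdEnumerator" Name="GetDeletedOrderIdsInstance"
                  ReturnParameterName="deletedOrderIds" />
</MethodInstances>
```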
We would appreciate it if anyone could help us or point us in the right direction. Is there any workaround?
- Edited by mark.tz Wednesday, December 18, 2013 8:10 AM