Auto populate a lookup field with multiple entries after selecting multiple entries from a lookup

Please excuse the vagueness; it is required due to the type of work.

I have a list that indicates upcoming work on various items. When the work on those items is finished, drawings need to be updated. As it stands, I have the items showing the date the work is finished and a lookup that lists the various work numbers being performed on each item. The work being performed affects various drawings, so I have another lookup that looks up all the drawings. I want the first lookup (the work numbers) to cascade to the drawings lookup, so that when I select the work numbers, only the drawings that are affected auto-populate. I do not have access to InfoPath or SharePoint Designer (my company has restricted them), so all I can do is put script in a web part. Any ideas?

I have found script that works when only one item is selected in the first lookup: it then shows only the applicable items in the next lookup. I can’t seem to find anything that lets me select multiple items in the first lookup and display all of the applicable items in the second.
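
For what it’s worth, the core of a multi-select cascade is just turning every selected work-number ID into one OR clause of a filter against the drawings list. Below is a minimal sketch of that filtering step, assuming the drawings list is called Drawings and carries a single-value lookup back to the work numbers that the REST API exposes as WorkNumberId (the list and field names are hypothetical). In a web part this call would normally be made from client-side script, but the endpoint and $filter are the same.

    import requests

    def drawings_for_work_numbers(site_url, selected_ids, session):
        """Return drawings whose work-number lookup matches ANY selected ID."""
        # one "WorkNumberId eq N" clause per selected work number, OR-ed together
        clauses = " or ".join("WorkNumberId eq {}".format(i) for i in selected_ids)
        url = ("{}/_api/web/lists/getbytitle('Drawings')/items"
               "?$select=Id,Title&$filter={}").format(site_url, clauses)
        # session is assumed to already carry whatever authentication the site needs
        resp = session.get(url, headers={"Accept": "application/json;odata=verbose"})
        resp.raise_for_status()
        return resp.json()["d"]["results"]

    # e.g. all drawings affected by work numbers 12, 15 and 20:
    # drawings_for_work_numbers("https://server/sites/eng", [12, 15, 20], requests.Session())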

How to use a Lookup column?

I have one custom SharePoint list. In the list I have 4 columns:

  • Store No
  • Store Name
  • Item No
  • Item Name

I want to create a child list in such a way that the Item Name and Store Name columns in the parent list are populated with values from the child list, based on the values of the Store No and Item No columns in the parent list.
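
Two hedged notes, since I can only guess at the exact setup: a lookup column can do part of this with no code, because when you create the lookup (for example on Store No) you can tick additional fields from the source list (such as Store Name) and SharePoint displays them alongside it. If the values really have to be written into the parent list’s own columns, the logic is just “find the child row with this Store No / Item No pair and copy its names over”. A rough sketch of that matching step, assuming a child list named 'Store Items' with columns StoreNo, ItemNo, StoreName and ItemName (all names hypothetical):

    def resolve_names(site_url, store_no, item_no, session):
        """Look up StoreName/ItemName for a given StoreNo + ItemNo pair."""
        url = ("{}/_api/web/lists/getbytitle('Store Items')/items"
               "?$select=StoreName,ItemName"
               "&$filter=StoreNo eq '{}' and ItemNo eq '{}'").format(site_url, store_no, item_no)
        resp = session.get(url, headers={"Accept": "application/json;odata=verbose"})
        resp.raise_for_status()
        results = resp.json()["d"]["results"]
        # None if no child row matches; otherwise a dict with both names
        return results[0] if results else None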

Lookup column output includes strange characters

Hi all,

I have a lookup column (“Title”) that returns text, which it looks up from a different library (obviously).

I have a workflow that emails the user when the list item is created, and I have included the Title field in that email. However, the email shows the contents of the Title field along with other characters. For example:

The column in the list shows the looked-up value on its own.

The column in the email shows 3;#3

My environment is SharePoint 2013 and I am using a SharePoint 2010 workflow.

Why is it adding these additional characters?

Thanks.
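
For context on the pattern itself: SharePoint serialises a lookup value as ID;#Value, where the number before ";#" is the ID of the item in the source list and the text after it is the looked-up value (here both happen to be 3, which is why it looks doubled). In a SharePoint Designer workflow the usual fix is to change how the field is returned in the string builder, returning it as a string rather than as the raw lookup. If the raw string ever has to be cleaned up in code, a minimal sketch:

    def lookup_display_value(raw):
        """Strip the 'ID;#' prefix SharePoint puts on serialised lookup values."""
        return raw.split(";#", 1)[1] if ";#" in raw else raw

    print(lookup_display_value("3;#3"))          # -> '3'
    print(lookup_display_value("7;#Drawing A"))  # -> 'Drawing A'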

Why can’t I get the Lookup Column ID from the Source List with Flow?

I have a question about Flow for a SharePoint list. I have two lists, and one of them has a lookup column pointing to the other.
List 1: Currency List (Target List)

List 2: Price Table (Source List). I want to get the value of Ask Rate (a lookup column) into the ExRate column (a number column) to use in Price (a calculated column).

My Flow: (screenshot omitted)

Does anyone know what is happening? Please help me!

And another question: if I don’t have ExRate, I want to use an expression to do the calculation in the Price column. What would that expression be?

SharePoint Designer workflow to copy a list item to another list that includes a lookup field

I need to merge two SharePoint lists into a third list. I’m using the copy item command in SharePoint Designer. I’ve also tried the create item command.

Both work well until I get to a lookup field. The lookup field doesn’t copy. I tried creating a local variable but that doesn’t work either.

I’m using SharePoint Designer 2013 for an Office 365 implementation. Any help would be greatly appreciated.
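
One detail that may help whoever picks this up: a lookup field only stores the numeric ID of the item in the lookup’s source list, so the copy typically only works if the third list’s lookup column points at the same source list and the workflow sets the field by ID (in SharePoint Designer, returning the source field as the lookup ID rather than as a string). A minimal sketch of the same idea outside the workflow, assuming the lookup is exposed as ProjectId in the REST API (a hypothetical name):

    def build_copy_payload(source_item):
        """Carry a lookup across lists by copying its numeric ID, not its text."""
        return {
            "Title": source_item["Title"],
            # the target list's lookup must point at the same source list,
            # otherwise the same ID would refer to a different item there
            "ProjectId": source_item["ProjectId"],
        }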

Scrapy does not yield URLs of websites whose DNS lookup failed

I have a text file containing a list of URLs that get redirected to other URLs, and I want to collect all of the redirected URLs. So I wrote a spider that reads the URLs from the text file. For a few of them I now get errors such as “DNS lookup failed” or “No route to host”. I checked those URLs directly in a browser and they return an “IP address not found” error. However, I want Scrapy to yield every URL regardless of the error. Any solutions to achieve this?

Here is the spider I ran

import scrapy

class AwesomeSpiderSpider(scrapy.Spider):
    name = 'web_uk'
    # opening the list of urls that gets redirected
    f = open("urls.txt")
    start_urls = [url.strip() for url in f.readlines()]
    f.close()

    def parse(self, response):
        item = {}
        item['Web Address'] = response.url
        yield item

Here is the output

2019-07-04 03:02:03 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://gatesidedevelopments.com/> (referer: None) ['cached']
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://honka.com/gb/en/> {'Web Address': 'https://honka.com/gb/en/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.dapconstruction.co.uk/> {'Web Address': 'https://www.dapconstruction.co.uk/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.virtueprojects.com> {'Web Address': 'http://www.virtueprojects.com'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://waynemoore.com/> {'Web Address': 'https://waynemoore.com/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.avenuenorth.co.uk/> {'Web Address': 'http://www.avenuenorth.co.uk/'}
2019-07-04 03:02:03 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.galkivconstruction.co.uk/robots.txt> (failed 1 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:03 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.62 Safari/537.36
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.mic.uk.com> {'Web Address': 'http://www.mic.uk.com'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.vlconstruction.co.uk/> {'Web Address': 'https://www.vlconstruction.co.uk/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.whitehalloflondon.co.uk> {'Web Address': 'http://www.whitehalloflondon.co.uk'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.vandthomes.com> {'Web Address': 'http://www.vandthomes.com'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.atlanticdwellings.com/> {'Web Address': 'https://www.atlanticdwellings.com/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.rgfhomeimprovements.co.uk> {'Web Address': 'http://www.rgfhomeimprovements.co.uk'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://leonoc.co.uk/> {'Web Address': 'https://leonoc.co.uk/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 http://www.home-refurbishments.co.uk/> {'Web Address': 'http://www.home-refurbishments.co.uk/'}
2019-07-04 03:02:03 [scrapy.core.scraper] DEBUG: Scraped from <200 https://gatesidedevelopments.com/> {'Web Address': 'https://gatesidedevelopments.com/'}
2019-07-04 03:02:07 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.rrawlins.co.uk/robots.txt> (failed 1 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:07 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36
2019-07-04 03:02:11 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.galkivconstruction.co.uk/robots.txt> (failed 2 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:11 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.45 Safari/537.36
2019-07-04 03:02:12 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.rrawlins.co.uk/robots.txt> (failed 2 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:12 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36
2019-07-04 03:02:17 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET http://www.galkivconstruction.co.uk/robots.txt> (failed 3 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:17 [scrapy.downloadermiddlewares.robotstxt] ERROR: Error downloading <GET http://www.galkivconstruction.co.uk/robots.txt>: DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
Traceback (most recent call last):
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/python/failure.py", line 512, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/endpoints.py", line 975, in startConnectionAttempts
    "no results for hostname lookup: {}".format(self._hostStr)
DNSLookupError: DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:17 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36
2019-07-04 03:02:19 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET http://www.rrawlins.co.uk/robots.txt> (failed 3 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:19 [scrapy.downloadermiddlewares.robotstxt] ERROR: Error downloading <GET http://www.rrawlins.co.uk/robots.txt>: An error occurred while connecting: 113: No route to host.
Traceback (most recent call last):
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
ConnectError: An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:19 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.86 Safari/537.36
2019-07-04 03:02:22 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.rrawlins.co.uk/> (failed 1 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:22 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.79 Safari/537.36
2019-07-04 03:02:23 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.galkivconstruction.co.uk> (failed 1 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:23 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.43 Safari/537.31
2019-07-04 03:02:29 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.galkivconstruction.co.uk> (failed 2 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:29 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36
2019-07-04 03:02:29 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.rrawlins.co.uk/> (failed 2 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:29 [scrapy_user_agents.middlewares] DEBUG: Assigned User-Agent Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36
2019-07-04 03:02:34 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET http://www.galkivconstruction.co.uk> (failed 3 times): DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:34 [scrapy.core.scraper] ERROR: Error downloading <GET http://www.galkivconstruction.co.uk>
Traceback (most recent call last):
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/python/failure.py", line 512, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/twisted/internet/endpoints.py", line 975, in startConnectionAttempts
    "no results for hostname lookup: {}".format(self._hostStr)
DNSLookupError: DNS lookup failed: no results for hostname lookup: www.galkivconstruction.co.uk.
2019-07-04 03:02:36 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET http://www.rrawlins.co.uk/> (failed 3 times): An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:36 [scrapy.core.scraper] ERROR: Error downloading <GET http://www.rrawlins.co.uk/>
Traceback (most recent call last):
  File "/home/ubuntu/scrapy/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
ConnectError: An error occurred while connecting: 113: No route to host.
2019-07-04 03:02:36 [scrapy.core.engine] INFO: Closing spider (finished)
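
One way to keep every URL in the output even when the request itself fails (DNS lookup failed, no route to host, and so on) is to attach an errback to each request and yield an item from there too. A sketch reworking the spider above along those lines:

    import scrapy

    class AwesomeSpiderSpider(scrapy.Spider):
        name = 'web_uk'

        def start_requests(self):
            # read the redirect list and schedule one request per URL
            with open("urls.txt") as f:
                for url in (line.strip() for line in f if line.strip()):
                    yield scrapy.Request(url, callback=self.parse,
                                         errback=self.on_error)

        def parse(self, response):
            # request succeeded (possibly after redirects): record the final URL
            yield {'Web Address': response.url}

        def on_error(self, failure):
            # request failed: record the URL that was attempted, plus the reason,
            # so nothing silently disappears from the output
            yield {'Web Address': failure.request.url,
                   'Error': repr(failure.value)}

If the robots.txt requests for the dead hosts are also adding noise, setting ROBOTSTXT_OBEY = False in the project settings skips them, but that is a separate decision.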

Using Flow to get a Lookup Column value and use that value in a calculation

I am setting up a Flow that gets the value of a lookup column for use in a calculated column. My lists are below:

List A: Target List, containing Currencies and their exchange rate to VND (called exRate)

List B: Source List, a product list, where I want to calculate the price in VND

I want the ExchangeRate column in List B to get its data from the lookup column “Currencies: exRate”, and the “Price in VND” column to be calculated as OriPrice x ExchangeRate. And when “exRate” changes in List A, the “Exchange Rate” and “Price in VND” columns in List B should be updated.

I want to use MS Flow and also SharePoint 2013 workflows.

Please give me some advice.

Thanks
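
A rough sketch of the update step the Flow (or 2013 workflow) has to perform, assuming List B exposes the currency lookup as CurrenciesId in the REST API and has number columns ExchangeRate and OriPrice (all of these names are guesses for illustration): get the looked-up currency item, read exRate, then write the fields back to the product item.

    def updated_fields(product_item, currency_item):
        """Fields to write back to a List B item from its looked-up currency."""
        ex_rate = currency_item["exRate"]
        return {
            "ExchangeRate": ex_rate,
            # if "Price in VND" is a true calculated column, SharePoint will
            # recompute it from OriPrice x ExchangeRate on its own and this
            # line is unnecessary; it is shown only for the non-calculated case
            "PriceInVND": product_item["OriPrice"] * ex_rate,
        }

For the “when exRate changes” part, the usual shape is a Flow triggered on modification of List A that filters List B by the lookup ID and applies this update to each matching item.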