I use the Workflow History list for debugging, and for the option of turning on logging for workflows that are in production. I would, however, like a workflow that purges the records in this list based on the workflow name. I have created a workflow and an HTTP DELETE request, but can't get it to work, and I can't find any documentation on how to do this. What I have as the string in my call is
[%Workflow Context:Current Site URL%]/_api/lists/GetByTitle('Workflow History')?$select=Id
I get the following response code:
I think I'm getting it, but I still don't know how to put the pieces together. My understanding now is that I create an HTTP REST call to select the IDs of each record, then follow it with two more HTTP requests: one to get the item and the other to delete the item. It is these latter two that I don't know how to do. Am I on the right track?
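For reference while untangling this, here is a minimal Python sketch of the request shapes the workflow would issue — not tested against SharePoint Designer, and with assumptions flagged: the site URL is a placeholder, and the filter column naming the workflow is hypothetical (out of the box the history list often only carries an association GUID, so verify the internal column name first). Note that the middle "get the item" step can usually be skipped by sending `IF-MATCH: *` on the delete.

```python
SITE = "https://contoso.sharepoint.com/sites/demo"  # assumption: your site URL
LIST_API = SITE + "/_api/web/lists/GetByTitle('Workflow History')"

def select_ids_request(workflow_name):
    """Step 1: GET the Ids of history items for one workflow.
    Assumption: 'WorkflowDisplayName' is a stand-in filter column; check the
    real internal name with a plain /items?$top=1 query first."""
    filt = "WorkflowDisplayName eq '" + workflow_name + "'"
    return {
        "method": "GET",
        "url": LIST_API + "/items?$select=Id&$filter=" + filt,
        "headers": {"Accept": "application/json;odata=verbose"},
    }

def delete_item_request(item_id):
    """Step 2 (looped per Id): delete one history item.
    Workflow HTTP actions typically POST with an X-HTTP-Method override;
    IF-MATCH: * avoids having to fetch the item's etag separately."""
    return {
        "method": "POST",
        "url": LIST_API + "/items(" + str(item_id) + ")",
        "headers": {
            "X-HTTP-Method": "DELETE",
            "IF-MATCH": "*",
            "Accept": "application/json;odata=verbose",
        },
    }
```

The key points are the `/items` segment (querying the list itself only returns the list's own Id) and the delete headers; authentication/request-digest handling is left to the workflow engine.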
I am looking for some help here.
I have a need to find all InfoPath forms in a SharePoint Online environment. I need to run through all site collections (a lot of them) and write the results to a CSV file for review. I have noticed there is a PowerShell script here: infopath script. However, when I run it, it keeps prompting me for login details for the next site collection. Since I have so many site collections, is there any way to stop this and keep the script running automatically until the end? Also, is there a way to output the results to a CSV file for review?
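Two notes while this question stands. The repeated prompting usually means the script calls its connect cmdlet once per site without reusing a stored credential object; capturing credentials once up front and passing them to each connection is the usual fix. On the CSV half, the output shape is simple enough to sketch — here in Python with made-up rows (in PowerShell the equivalent is piping the result objects to `Export-Csv`):

```python
import csv

# Illustrative rows only: in the real script each row would come from
# scanning one site collection for .xsn form templates.
results = [
    ("https://contoso.sharepoint.com/sites/hr", "Shared Documents/template.xsn"),
    ("https://contoso.sharepoint.com/sites/it", "Lists/Helpdesk/form.xsn"),
]

with open("infopath-forms.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["SiteCollection", "FormUrl"])  # header row
    writer.writerows(results)
```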
I mapped a SharePoint Online document library (Y:) and was planning to use robocopy to synchronize a local directory to the library. The problem I am encountering is that it always views the source (local) files as newer. I used /XO, /FFT, and /M, and it still lists all source files as new and the remote files as extra. If I actually attempt the copy, it copies all of the files.
robocopy "S:\Job Descriptions" "Y:" /L /XO /FFT
The conflicting files have the same size and the same name, while the local files are older. For example, the local file's modified date is 7/8/2016 and the remote file's is 5/19/2019.
Any suggestions that could explain why robocopy is reporting this? Should I just use PowerShell instead?
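One thing worth ruling out: /FFT only widens the timestamp comparison to a 2-second FAT-style tolerance, so it cannot reconcile a gap of years; and WebDAV-mapped drives often fail to expose comparable timestamps and attributes at all, which is one possible explanation for robocopy classifying everything as New/Extra regardless of dates. A small Python sketch of what the /FFT check actually does (an approximation, to show why it doesn't help here):

```python
from datetime import datetime

def fft_equal(a: datetime, b: datetime) -> bool:
    """Approximation of robocopy's /FFT comparison: timestamps within
    2 seconds (FAT granularity) count as equal. Larger differences,
    such as the three-year gap in the question, are still 'different'."""
    return abs((a - b).total_seconds()) <= 2

local  = datetime(2016, 7, 8, 10, 0, 0)    # source file from the question
remote = datetime(2019, 5, 19, 10, 0, 0)   # mapped-library copy

print(fft_equal(remote, datetime(2019, 5, 19, 10, 0, 1)))  # True: within 2 s
print(fft_equal(local, remote))                            # False: years apart
```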
We have one page on our SharePoint site that I would like to share with a user group, and I would like only that page to be viewable to them. Is it possible for only that page to show up for the group, or does everything have to be viewable to them?
The user group only has Read permission.
I am trying to make the following happen on a list: if a user selects “Most” as a choice, then another column will return 1. If a user selects “Somewhat”, then that column will return 0.5. If a user selects “None”, then that column will return 0. Any advice on the formula syntax?
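A calculated column with nested IFs is the usual approach. A sketch, assuming the choice column is named Rating (substitute your real column name) and the calculated column's return type is set to Number:

```
=IF([Rating]="Most",1,IF([Rating]="Somewhat",0.5,0))
```

The final 0 is the fallback for “None” (and any other value); add a third nested IF if “None” must be distinguished from blank.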
I have a requirement where a Microsoft Flow should export files from a SharePoint Online document library to an FTP server.
In my enterprise search page, I want users to be able to filter their results by choosing ‘site’ metadata. I’ve already set up a filter for site name using the ows_SiteName crawled property – it works great, and it applies to the content of the site in search! But I have other metadata about each site that I want users to filter on – let’s use ‘Client’ as an example. I’ve set up this other metadata as properties in the site collection property bag and marked them as indexable. This works! I mapped the crawled property to a managed property and set up my filter to include this property. It works to an extent, but the property only seems to filter search results that are ‘sites’. I want the filter to also apply to the content of my sites (e.g. documents in a library).
I’m a SharePoint farm admin, and am in charge of permission reviews for requested SharePoint add-ins. In the past, we have judged certain app requests as too permissive and denied them based on the description alone from the request itself.
My issue is that our group does not fully agree on what each description means, so our assessments are inconsistent.
► Example 1: “Let it access basic information about the users of this site”
(1) What basic information? User ID, display name, and email, but what else…manager?
(2) Microsoft loosely switches between calling site collections "sites" and sites/subsites "webs". So does that mean users of the site collection, or of the subsite level?
I understand the User-Only, App-Only, and User+App policies; but since we only have access to the text description in the App Store, when is which policy applied?
► Example 2: “Let it create or delete document libraries and lists in this site collection.” (DocuSign for SharePoint)
The implication, based on the other “Let it…” permissions, is that this is User+App. But because the add-in is for signing documents, it doesn't feel right that the user would need list-management rights for it to function as expected.
I know there are limitations in SharePoint Online around finding out the last crawl time, etc., but this is a pain :( I have re-indexed libraries, lists, and sites, but have no idea when the scheduled crawl time is, or even when content was last crawled.
Is there a way to get this info, perhaps even via the REST API? Thanks in advance.