“Report Server Central Administration Feature” is missing in Central Administration Site collection features

I am using SharePoint Server 2016 and SQL Server 2016 SP1. I installed the “Reporting Services – SharePoint” feature on the application server (Custom role) and the “Reporting Services Add-in for SharePoint Products” feature on the web front-end server. I am also able to create the SQL Server Reporting Services service application.

We have migrated from SharePoint 2013 to SharePoint 2016 and have a document library with the following reporting content types:

  • Report Builder Model
  • Report Builder Report
  • Report Data Source

After migration I see some faulty content types in the target environment. Refer to the screenshot below.

[Screenshot: broken reporting content types in the migrated document library]

From the image above I gather that the relevant features may not be activated. However, I could not find the Reporting Services features in Central Administration or in the site collection to activate them. These are the features I am looking for:

  • “Power View Integration Feature” site collection feature.
  • “Report Server Central Administration Feature” central administration site collection feature.

Daily report on SharePoint

I’ve got a list with ~2000 items. I’m hoping to add a daily status update (i.e. Up/Down) to each item, and then use the results to make monthly/yearly/etc. trend charts. Is there a clean way to do this without the lists getting uncontrollably large?

Edit: To keep it simple, I don’t really need to know how to get the data; I’m just wondering about the best way of setting it up. Let’s say I have a thermometer, and each day I want someone to check it and add the value to that item in the list (thermometer 1), while keeping access to all the previous values for trending. Now say I have 2,000 thermometers: what would be a good way to track all that data? All I can come up with is versioning, and I’m not sure that would be the best way.
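Since the question is about data shape rather than any particular API, here is a minimal sketch (plain Python, with illustrative list and column names) of the usual alternative to versioning: a separate flat “Readings” list holding one item per thermometer per day, keyed by thermometer ID and date, instead of stacking values in one item’s version history.

```python
from collections import defaultdict
from datetime import date

# A flat "Readings" list: one row per thermometer per day.
# Each dict mirrors the columns a separate SharePoint list would have
# (names here are illustrative, not from the original question).
readings = [
    {"thermometer": "Therm-001", "day": date(2019, 1, 1), "value": 21.5},
    {"thermometer": "Therm-001", "day": date(2019, 1, 2), "value": 22.0},
    {"thermometer": "Therm-002", "day": date(2019, 1, 1), "value": 19.8},
]

# Trending becomes a simple group-by on the thermometer ID, which
# version history on a parent item cannot give you without crawling
# every version of every item.
history = defaultdict(list)
for r in readings:
    history[r["thermometer"]].append((r["day"], r["value"]))

averages = {
    t: sum(v for _, v in rows) / len(rows) for t, rows in history.items()
}
```

One row per reading keeps the history directly queryable and chartable. At 2,000 thermometers that is roughly 730,000 rows per year, so indexed columns and filtered views matter, but the data model itself stays trivial.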



Dynamic report generation

I have a reporting requirement where the data visualizations and reporting would be done via the Zeppelin tool. Zeppelin can connect to only one datasource at a time. In my current application, the user data is kept in a separate database (db1) and the mapping is kept in another database inside my application. The contents of db1 change continuously. If I need to create a new project, a new database is created and the required columns are chosen dynamically. Currently, Java code does the job of combining both (the mapping and the actual data) and creates the reports. Now I need to move to Zeppelin for further advancements in analytics.

How do I achieve this?
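Since a JDBC-style interpreter points at a single connection, one common workaround (a sketch, not specific to your actual engines or schemas) is to expose both databases through one connection and do the join there, so the reporting tool sees a single datasource. With SQLite this can be illustrated with `ATTACH DATABASE`; with other engines the analogue would be cross-database queries or a consolidated reporting view that replaces the Java glue code. All table and column names below are made up:

```python
import sqlite3

# Stand-in for the application's mapping database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mapping (user_id INTEGER, label TEXT)")
conn.execute("INSERT INTO mapping VALUES (1, 'Analyst'), (2, 'Manager')")

# Attach a second database (stand-in for db1) so ONE connection --
# i.e. one "datasource" from the reporting tool's point of view --
# spans both.
conn.execute("ATTACH DATABASE ':memory:' AS db1")
conn.execute("CREATE TABLE db1.users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO db1.users VALUES (1, 'alice'), (2, 'bob')")

# The combined report is a single cross-database join; a reporting tool
# pointed at this one connection gets mapping + data together.
rows = conn.execute(
    """
    SELECT u.name, m.label
    FROM db1.users AS u
    JOIN mapping AS m ON m.user_id = u.id
    ORDER BY u.name
    """
).fetchall()
```

The same idea can be packaged as a view (or a periodically refreshed reporting table) so that each dynamically created project only needs its view regenerated, not a new datasource configured.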

Tool to prepare report of security auditing

I was dorking my friend’s website to make sure there is no leak of information that is not meant to be public. I found domains and sub-domains in the search results.

Is there a tool that can help me prepare the report? For example, a report in which domains and subdomains are represented in some graphical format, something like a tree structure?

Also, what tools should be used to capture details during the target enumeration phase?
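The reporting part is also easy to script if no tool fits. As a sketch (the hostnames below are made up, and the two-label root rule is naive; real tooling should consult the Public Suffix List), grouping discovered hostnames under their parent domain and rendering a text tree looks like this:

```python
from collections import defaultdict

# Hypothetical enumeration results, e.g. collected from search-engine dorks.
found = [
    "example.com",
    "mail.example.com",
    "dev.mail.example.com",
    "shop.example.org",
]

# Group each hostname under its registrable domain
# (naive last-two-labels rule, for illustration only).
tree = defaultdict(list)
for host in found:
    root = ".".join(host.split(".")[-2:])
    if host == root:
        tree.setdefault(root, [])
    else:
        tree[root].append(host)

# Render as an indented tree, one root domain per branch.
lines = []
for root in sorted(tree):
    lines.append(root)
    for sub in sorted(tree[root]):
        lines.append("  └─ " + sub)
report = "\n".join(lines)
print(report)
```

From this structure it is a short step to a graphical tree (e.g. feeding the same grouping into Graphviz) instead of plain text.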

Popularity trend report always 0

Details below. [Screenshot: Popularity Trends report showing zero values]

OK, so I’ve now been troubleshooting why our popularity reports always show zero, and I feel I am very close.

  • Log files are being populated
  • Databases are being populated
  • Usage files are being created

The EventStore folder is empty! (Yes, I have checked the SharePoint usage data import job, and it is enabled.)

However, I then checked the job which is supposed to run overnight:

$job = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.UsageAnalyticsJobDefinition
$job.GetAnalysisInfo()

and voilà! It hasn’t been run, but I can’t seem to find that job in Central Administration to enable it.

I have 8 GB of RAM, but the system reports 6.8 GB. How do I solve it?

I have 8 GB of RAM, but the system reports 6.8 GB, while the BIOS says 8 GB. I don’t have any swap. I’m on Lubuntu 18.04 64-bit.


sudo lshw -C memory
  *-firmware
       description: BIOS
       vendor: AMI
       physical id: 0
       version: F.10
       date: 05/17/2018
       size: 64KiB
       capacity: 15MiB
       capabilities: pci upgrade shadowing cdboot bootselect edd int13floppy1200 int13floppy720 int13floppy2880 int5printscreen int9keyboard int14serial int17printer acpi usb smartbattery biosbootspecification netboot uefi
  *-memory
       description: System Memory
       physical id: 9
       slot: System board or motherboard
       size: 8GiB
     *-bank:0
          description: SODIMM DDR4 Synchronous Unbuffered (Unregistered) 2400 MHz (0,4 ns)
          product: HMA81GS6AFR8N-UH
          vendor: Hynix
          physical id: 0
          serial: 825AC392
          slot: Bottom - Slot 1 (left)
          size: 8GiB
          width: 64 bits
          clock: 2400MHz (0.4ns)
     *-bank:1
          description: SODIMM [empty]
          product: Unknown
          vendor: Unknown
          physical id: 1
          serial: Unknown
          slot: Bottom - Slot 2 (right)
  *-cache:0
       description: L1 cache
       physical id: b
       slot: L1 - Cache
       size: 192KiB
       capacity: 192KiB
       clock: 1GHz (1.0ns)
       capabilities: pipeline-burst internal write-back unified
       configuration: level=1
  *-cache:1
       description: L2 cache
       physical id: c
       slot: L2 - Cache
       size: 1MiB
       capacity: 1MiB
       clock: 1GHz (1.0ns)
       capabilities: pipeline-burst internal write-back unified
       configuration: level=2
  *-cache:2
       description: L3 cache
       physical id: d
       slot: L3 - Cache
       size: 4MiB
       capacity: 4MiB
       clock: 1GHz (1.0ns)
       capabilities: pipeline-burst internal write-back unified
       configuration: level=3
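For context: `lshw` and the BIOS report the installed modules (8 GiB), while the OS reports only what the kernel is given after pre-boot reservations, typically an integrated-GPU carve-out plus firmware reservations. A small sketch of quantifying that gap by parsing `/proc/meminfo`-style text (the sample figures below are illustrative, chosen to match the ~6.8 GiB in the question, not read from this machine):

```python
def mem_total_gib(meminfo_text: str) -> float:
    """Return MemTotal from /proc/meminfo-style text, in GiB."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kib = int(line.split()[1])   # /proc/meminfo values are in kiB
            return kib / (1024 ** 2)
    raise ValueError("MemTotal not found")

# Sample text mimicking /proc/meminfo on an 8 GiB machine where
# ~1.2 GiB is reserved before the kernel boots (e.g. iGPU carve-out).
sample = "MemTotal:        7130316 kB\nMemFree:         5000000 kB"

installed_gib = 8.0                      # what lshw / the BIOS report
visible_gib = mem_total_gib(sample)      # what the OS sees
reserved_gib = installed_gib - visible_gib
```

On a real system you would read the actual file (`open("/proc/meminfo").read()`); if the difference matches the iGPU memory size set in the BIOS, nothing is broken and no swap change will alter it.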

How does VirusTotal check if a file is malicious or not and how trustworthy is its report?

I have often seen people provide the VirusTotal report for a file to prove it’s not a virus.

But I just checked, and when I submit a file it’s not even uploading it (I checked my bandwidth); it’s just computing the file’s hash somehow without uploading it. I’m pretty sure the file isn’t being uploaded, because a 10 MB executable takes half a second to “load” on their website and get checked, even though my upload speed is 30 KB/s! (I tried renaming the file too, but it still only took half a second.)

So I have three questions:

  1. When I submit a file, how does it upload so fast? Is my file even getting uploaded? If not, how does it compute the hash?

  2. Does it say a file is malicious only if the hash of that file is present in the database of an AV company? If so, doesn’t that mean I can easily bypass it by changing the PE/ELF file a bit?

  3. Is there any better alternative that actually performs some static or even dynamic analysis on the file to check whether it’s malicious?
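On questions 1 and 2, the common explanation for the instant “upload” is that the page hashes the file locally in the browser and looks the hash up first; only files the service has never seen need to be uploaded. Renaming never changes the hash, but editing a single byte does, which is exactly the bypass concern in question 2. A minimal sketch of both effects (the byte strings below are a toy stand-in for an executable, not real malware data):

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of the file contents -- the lookup key, not the name."""
    return hashlib.sha256(data).hexdigest()

original = b"MZ\x90\x00" + b"\x00" * 1000   # toy stand-in for a PE file
tweaked = bytearray(original)
tweaked[-1] ^= 0x01                          # flip one bit near the end

# A toy model of a hash-keyed "known bad" database.
known_bad = {sha256(original)}

# Renaming changes nothing the hash depends on, so lookup still matches;
# a one-bit content change yields a completely different digest and
# defeats a pure hash lookup.
hit_original = sha256(original) in known_bad
hit_tweaked = sha256(bytes(tweaked)) in known_bad
```

This is why hash lookup is only a first pass; catching the tweaked file requires actual static or dynamic analysis of the new content, which is what question 3 is asking for.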

Unable to View SSRS Report after Deployment

Apologies for the long post: I have a SharePoint 2013 server with Reporting Services 2014 installed in SharePoint Integrated mode, plus a document library with Reporting Services content types enabled. I have a number of existing reports that have been published to a report folder in the document library which can be viewed without any issues.

The SharePoint server is in one domain (Domain A) but the developers who create the reports (including me) have development laptops that are in a different domain (Domain B).

Until now, we have been deploying reports by uploading a zip file to the SharePoint server and deploying the reports using a custom PowerShell script which also updates shared data source and shared dataset references – this method makes it easy for us to deploy the reports to different environments but is quite time-consuming to package up and run.

To speed up our workflow, we wanted to try deploying the reports to our development server direct from Visual Studio, as the deployment process is much quicker.

I have a Reporting Services project and tried deploying to the document library direct from Visual Studio 2017 using the TargetReportFolder and TargetServerURL properties.

The deployment was successful but on viewing the reports after deployment from my local machine, I got this error message: “Error – for more information about this error navigate to the report server on the local server machine, or enable remote errors”.

On logging in to the server (with my credentials for Domain A) and clicking on the same reports, the reports displayed without any errors. The report folder and the individual reports have the same permissions inherited from the root site, and all developers’ Domain B logins are members of the Site Owners group (which is required in order to be able to deploy from Visual Studio).

Why do I get an error when trying to view the reports from my local machine but not on the server?