Detecting Magick use by using the Prime Sphere

I'm running a Mage: the Ascension 20th Anniversary Edition game, and one of my players previously played a Mage: the Ascension Second Edition chronicle with a different Storyteller.

In that chronicle they used the Prime Sphere as a universal detection spell for the use of magick (though that player is unsure whether that was just a house rule).

The M20 core rulebook states, in the description of what you can do with the first dot of the Prime Sphere on page 520 (highlights added by the asker):

She may spot energetic ebbs and flows, can sense and at least try to read Resonance and Synergy signatures, and could also absorb Quintessence into her personal Pattern.

One wiki states for a 1 dot rote of the prime sphere:

Etheric Senses: The mage can perceive Quintessential energy, and is alerted when someone uses magic in their vicinity. source

They cite Mage: The Ascension Revised Edition, pp. 179-180, as the source for this rote, together with page 520 of the M20 core rulebook.

Is that a correct application of the rules as written in Mage: the Ascension Second Edition, and would the rules of Mage: the Ascension 20th Anniversary Edition allow for a similar reading?

Help fleshing out an idea for detecting friend from foe [closed]

I need help fleshing out an idea into a game mechanic. There is a goal, and there are friendly NPCs and enemies. Both are walking toward the goal. However, the player can't tell friend from foe.

The goal is for the player to detect and eliminate the enemies before they reach the goal. I mostly need help with the detection part: most examples and ideas I've found are for the AI detecting the player, not the other way around.

What are some ways, examples, or ideas I can use to implement this sort of mechanic?
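One simple way to make the detection side concrete is to give enemies a subtle behavioral "tell" and let the player's observation of an agent accumulate suspicion over time. This is only a sketch of that idea; the class names, probabilities, and thresholds below are all invented for illustration.

```python
import random

class Agent:
    """An NPC walking toward the goal; enemies occasionally show a 'tell'."""

    def __init__(self, is_enemy, tell_chance=0.15):
        self.is_enemy = is_enemy
        self.tell_chance = tell_chance  # per-tick chance an enemy slips up
        self.suspicion = 0.0

    def observe(self, rng):
        """One tick of the player watching this agent."""
        if self.is_enemy and rng.random() < self.tell_chance:
            self.suspicion += 1.0          # a tell was spotted
        else:
            self.suspicion = max(0.0, self.suspicion - 0.1)  # doubts fade
        return self.suspicion

def classify(agent, rng, ticks=100, threshold=3.0):
    """Watch an agent for a while; flag it as hostile once past a threshold."""
    for _ in range(ticks):
        if agent.observe(rng) >= threshold:
            return True
    return False
```

Tuning `tell_chance`, the decay rate, and `threshold` controls how long the player must watch before committing, which is where the tension of the mechanic comes from: observing one agent means not observing the others.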

Detecting changes in surrounding Wi-Fi networks

I'm considering developing a simple Wi-Fi scanner and logging app that runs as a service 24 hours a day for months, passively collecting changes in the surrounding wireless environment, with the following features:

  • Log changes to BSSIDs, ESSIDs/SSIDs, signal strength, number of clients, etc.;
  • Display everything on a simple dashboard;
  • Notifications of new access points set up in the surrounding area;
  • Notifications of new Wi-Fi clients;
  • Notifications of SSID changes;
  • Possibly capture handshakes (though that's not a primary objective);
  • Run on a low-cost platform such as a Raspberry Pi, with a simple apt-get install to get it started.

I'm well aware that other solutions exist, such as Kismet logging, but I don't want to invest too much time recreating a solution that already exists. Is there something that already does all of this, or a combination of tools I could try? E.g. Kismet and Kibana in a Docker image?

Separate prerendered static page for Open Graph crawlers on Netlify (can't redirect by detecting bots) [closed]

I want to create a heavily browser-cached shell app: essentially one cached file, index.html, that works like an SPA and uses progressive enhancement to fill in content. The problem is the Open Graph meta tags for Facebook, LinkedIn, and Twitter, and of course SEO as well (yes, I know that these days Google can parse JavaScript-heavy applications, but other crawlers can't). We can't redirect server-side based on bot detection because we're using static file hosting like Netlify, so the basic idea is this:

Put a canonical URL on every page to point crawlers to /seo/$page, and under the /seo subfolder serve pre-rendered static pages that contain all the Open Graph tags and omit the CSS and JavaScript, leaving just the content and HTML tags (this part isn't strictly necessary, but the idea is to save unnecessary bandwidth).

Would this solution be considered good practice? Would it be considered cloaking? What are its downsides? Is it considered bad practice to serve "stripped-down" pages to bots (with the same content but not all the functionality)? Do you have any other suggestions for handling static pre-rendered pages that carry the SEO and Open Graph tags and are served only to bots?

And the most important question: I assume the links in Google search results would then carry the extra /seo/ path, which is not good. Is there any way to force Google to use the original links, or to redirect to the /og/ URLs only when serving Open Graph tags to Facebook and the other social networks, given that Google can actually parse JavaScript today?

Would sitemap.xml or robots.txt somehow help with redirecting just the Facebook parser?

Detecting conservation, loss, or gain in a crafting game with items and recipes

Suppose we're designing a game like Minecraft where we have lots of items $i_1, i_2, \dots, i_n \in I$ and a bunch of recipes $r_1, r_2, \dots, r_m \in R$. Recipes are functions $r: (I\times\mathbb{N})^n \rightarrow I\times\mathbb{N}$; that is, they take some items with non-negative integer quantities and produce an integer quantity of another item.

For example, the recipe for cake in Minecraft is:

3 milk + 3 wheat + 2 sugar + 1 egg $\rightarrow$ 1 cake

… and the recipe for torches is:

1 stick + 1 coal $\rightarrow$ 4 torches

Some recipes could even be reversible, for example: 9 diamonds $\leftrightarrow$ 1 diamond block

If there's some combination of recipes we can repeatedly apply to end up with more of the items than we started with, then the game is poorly balanced and players can exploit it. It's more desirable to design the game with recipes that conserve items, or possibly lose some (like thermodynamic entropy in the real world: you can't easily un-burn the toast).

Is there an efficient algorithm that can decide if a set of recipes will:

  • conserve items?
  • lose items to inefficiency?
  • gain items?

Is there an efficient algorithm that can find the problematic recipes if a game is imbalanced?

My first thoughts are that there's a graph-structure / maximum-flow problem here, but it's very complex, and that it resembles a knapsack problem. Or maybe it could be formulated as a SAT problem; this is how I'm considering coding it at the moment, but something more efficient might exist.

We could encode recipes in a matrix $\mathbf{R} \in \mathbb{Z}^{m \times n}$ where rows correspond to recipes and columns correspond to items. Entries are negative if the recipe consumes the item, positive if it produces it, and zero if the item is unused. Similar to a well-known matrix method for graph cycle detection, we could raise $\mathbf{R}$ to some high power and sum each row to see whether item totals keep going up, stay balanced, or go negative. However, I'm not confident this always works.
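As a concrete starting point, the matrix encoding can be sketched like this in Python (the item order, recipe rows, and function names are invented for illustration): each recipe becomes a net-change row vector, and a candidate exploit is a vector of recipe application counts whose combined net change is non-negative everywhere and strictly positive somewhere.

```python
# Sketch of the recipe matrix described above, using the Minecraft examples.
ITEMS = ["milk", "wheat", "sugar", "egg", "cake", "stick", "coal", "torch"]

# One row per recipe: negative entries are consumed, positive are produced.
RECIPES = [
    [-3, -3, -2, -1, 1, 0, 0, 0],   # 3 milk + 3 wheat + 2 sugar + 1 egg -> 1 cake
    [0, 0, 0, 0, 0, -1, -1, 4],     # 1 stick + 1 coal -> 4 torches
]

def net_change(counts):
    """Net item change after applying recipe i counts[i] times (counts >= 0)."""
    total = [0] * len(ITEMS)
    for count, row in zip(counts, RECIPES):
        for j, delta in enumerate(row):
            total[j] += count * delta
    return total

def is_exploit(counts):
    """A combination is an exploit if on net it consumes nothing
    (all entries non-negative) yet produces something (some entry positive)."""
    change = net_change(counts)
    return all(d >= 0 for d in change) and any(d > 0 for d in change)
```

Deciding whether any such counts vector exists at all is then an integer feasibility question over $\mathbf{R}^\top x \ge 0$ (with $x \ne 0$), which suggests an LP/ILP solver rather than matrix powers, though I'm presenting this only as one possible framing.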

Any discussion, code, or recommended reading is very appreciated.

MAKE not detecting nested directories for #include

I’m attempting to build OBS-studio-webrtc (live streaming software) from source on a Linux box running Ubuntu 18.04.

There are two main git clones: OBS-studio-webrtc itself, and the webrtc package.

I'm encountering a recurring issue in which most of the .h and .cpp files (possibly all; I haven't checked the 267,544 files in the webrtc directory) fail to resolve the directories implied in their #include directives.


audio_processing.h, located at webrtc/include/audio, contains the initial includes:

#include "rtc_base/criticalsection.h"  #include "rtc_base/thread_annotations.h" 

both of which are located in webrtc/include/rtc_base

but running make returns "No such file or directory".

Changing the syntax to <rtc_base/thread_annotations.h> produces the same result.

I can write the full path "home/arctos/Desktop/stream/webrtc/rtc_base" for each include and that works, but the webrtc package is massive, and I'd rather find the root cause than work through each fatal error one by one.

I've used a variety of OBS-studio-webrtc forks as well as a few different webrtc packages, but this same error always presents itself. The webrtc directories and include files are all listed in the Findlibwebrtc.cmake file and appear in the output when

cmake -DUNIX_STRUCTURE=1 -DBUILD_BROWSER=ON -DCEF_ROOT_DIR="../../OBS-studio-webrtc" .. 

is called prior to the make call. I've experimented with installing webrtc both as a local user and as root, with no difference. I currently have the builds inside a single folder on the desktop.

I'm assuming the issue is my unfamiliarity with CMake, but I've been unable to resolve it.
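For reference, the usual fix for this class of error is to add the parent of the rtc_base folder to the compiler's header search path instead of rewriting every include. A minimal CMake sketch, assuming the layout described above (the target name obs-webrtc and the relative path are assumptions; the real Findlibwebrtc.cmake may already define a variable for this):

```cmake
# Assumed layout: headers live under webrtc/include, so that
# "rtc_base/thread_annotations.h" resolves to webrtc/include/rtc_base/...
set(LIBWEBRTC_INCLUDE_DIR "${CMAKE_SOURCE_DIR}/../webrtc/include")  # adjust to your checkout

# Modern CMake: attach the search path to the target that compiles the files.
target_include_directories(obs-webrtc PRIVATE "${LIBWEBRTC_INCLUDE_DIR}")

# Older style, applying to everything in this directory and below:
# include_directories("${LIBWEBRTC_INCLUDE_DIR}")
```

If the headers then still aren't found, comparing the `-I` flags in the verbose build output (`make VERBOSE=1`) against the path above should show which directory CMake is actually passing to the compiler.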