VS Debug not accepting the command line argument provided in project properties

A C++ project my team is working on in Visual Studio 2017 Community (v15.5.1) compiles to an .exe that can be run from a terminal with a command line argument pointing to a set of inputs. Normally, as I work on the code, I run it in Debug mode within VS, setting the command line argument by right-clicking the main project, going to Properties > Debugging > Command Arguments, and entering the path to the input set there, the same as I would after the .exe name when running it in a terminal.

The problem is that when I run in Debug mode, my command argument is not used; instead an older argument is used: whatever was last committed to our version control in the solution's .vcxproj.user file. I check out the latest copy, modify the command argument to my own path, run Debug, and it uses the argument from whatever I had last checked out, ignoring the path I provide.

What could be going wrong here? How do I get VS Debug mode to use the path I provide? The only diff I find against version control is my change to the .vcxproj.user file, in the <LocalDebuggerCommandArguments> tag. My working copy has the path I entered in VS Project Properties under Command Arguments, and yet running Debug mode doesn't use that path.
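For reference, the relevant part of the .vcxproj.user file looks roughly like this (the path and the Debug|x64 condition here are illustrative, not our actual values):

    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="15.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
        <!-- Value set via Properties > Debugging > Command Arguments -->
        <LocalDebuggerCommandArguments>C:\path\to\input_set</LocalDebuggerCommandArguments>
        <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
      </PropertyGroup>
    </Project>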

To confirm the .exe is working as expected: if I compile it for Release and run it from a terminal, it correctly accepts whatever command line argument I pass it.

Ubuntu 18.16 – Keyboards (Laptop, USB & Virtual) Not Accepting Input on Logon

I can't log back on to my laptop because no keyboard, mouse, or touchpad input is being accepted at logon. For keyboards this includes the laptop keyboard, an external USB keyboard, and the virtual on-screen keyboard; the last is not accessible because Tab, Enter, and mouse/touchpad movement and clicks do not work to select the on-screen numbers and letters.

Specs: Ubuntu 18.16, full USB installation that has been working for ~2 months, HP EliteBook laptop.

  1. I am not a complete Unix/Ubuntu newbie, but I have come back to it recently after two years away, so general debugging or recovery-mode advice would be appreciated.

  2. I don't think it's a hardware or BIOS issue, because I'm typing this question on the same laptop using Ubuntu from another USB key. So the USB stick I'm using now runs against the same hardware and BIOS settings as the problematic installation on the other stick. The other stick is a full installation carrying two months of my settings, customizations, and data.

  3. I suspect that changes I made yesterday to fix a brightness issue are causing the current problem. I do not recall all the steps, but I'll describe in general what I did to successfully fix the brightness not being adjustable on the full USB installation.

    a. Installed brightness-controller, brightness-controller-simple, and xbacklight. (Not sure if the package names are exact.)

    b. The xbacklight installation in particular had some dependencies that required installing additional packages whose names I don't have right now (I'm hoping to track down the pages of instructions I followed, but even that won't be complete, because I also installed some things that weren't in the instructions, based on dependency messages during the install). Edit: these are the packages I installed:

        sudo apt install xbacklight xorg xserver-xorg-video-intel

    The last package had dependencies requiring additional installs.

    c. I had made the following GRUB change and updated GRUB about a week ago, and I have rebooted many times since. It didn't fix the brightness issue then, but the change is part of the xbacklight install instructions I followed. Since the change had already been made a week ago, I did not run update-grub again yesterday:

    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_backlight=vendor"
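    (For context, I applied this the standard way, by editing /etc/default/grub and then regenerating the GRUB config, roughly:

        sudo nano /etc/default/grub    # set GRUB_CMDLINE_LINUX_DEFAULT as above
        sudo update-grub

    The exact editor doesn't matter; nano is just an example.)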

    d. Edit: I created /etc/X11/xorg.conf and added the following to it as part of the xbacklight install:

    Section "Device"
        Identifier "Device0"
        Driver     "intel"
        Option     "Backlight" "intel_backlight"
    EndSection

    Section "Monitor"
        Identifier "Monitor0"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "Monitor0"
        Device     "Device0"
    EndSection

Any help or leads are very much appreciated.

How to find the minimum number of states of a deterministic finite automaton accepting a given language

Let

  1. $L$ be a language over $\Sigma$, where $\Sigma = \{0, 1\}$ is the input alphabet.
  2. $L = \{\, w \mid w \in \Sigma^{*} \text{ and the number of 0s in } w \text{ is divisible by 3 and the number of 1s is divisible by 5} \,\}$.

What is the minimum number of states of a DFA accepting $L$?

My approach is to create a grid like this:

[image: grid of states used in my construction]

I am getting 6 x 4 = 24 states.

Can this be minimized further? Is there a general method for solving this kind of problem?
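For concreteness, if a state tracks only the pair (number of 0s mod 3, number of 1s mod 5), that encoding needs just 3 × 5 = 15 states, which makes me suspect my 24 can be reduced. Here is a small sketch of acceptance under that encoding (the code and its state encoding are my own illustration):

    #include <cstdio>
    #include <string>

    // State = (number of 0s mod 3, number of 1s mod 5); the start state is (0, 0)
    // and a string is accepted iff we end back at (0, 0).
    bool accepts(const std::string& w) {
        int zeros = 0, ones = 0;
        for (char c : w) {
            if (c == '0') zeros = (zeros + 1) % 3;
            else          ones  = (ones + 1) % 5;
        }
        return zeros == 0 && ones == 0;
    }

    int main() {
        std::printf("%d\n", accepts("000"));     // 1: three 0s, zero 1s
        std::printf("%d\n", accepts("11111"));   // 1: zero 0s, five 1s
        std::printf("%d\n", accepts("0011111")); // 0: two 0s is not divisible by 3
    }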

Min function accepting varying number of arguments in C++17

I came across this problem once again in the book The Modern C++ Challenge (problem 18), and I wonder how simple and elegant the implementation could be using C++17. The following is my solution. Ideas? ^_^

    #include <algorithm>

    // Variadic min taking a comparator first, then two or more values.
    template <typename Less, typename T, typename... Ts>
    constexpr const T& min(Less less, const T& a, const T& b, const Ts&... rems)
    {
        if constexpr (sizeof...(rems)) {
            // More arguments remain: fold the pairwise minimum into the rest.
            return min(less, std::min(a, b, less), rems...);
        }
        else {
            return std::min(a, b, less);
        }
    }
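A quick usage check, assuming the template above is in scope (the values are arbitrary):

    #include <functional>
    #include <iostream>

    int main() {
        std::cout << min(std::less<>{}, 3, 1, 2) << '\n';             // prints 1
        std::cout << min(std::less<>{}, 5.0, 4.5, 6.25, 4.0) << '\n'; // prints 4
    }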

Short analogue of Riffle[] accepting two blocks

Imagine that I have an array of elements that I need to loop through (i.e., execute some block A on each element), but between iterations I also have to "clean up" (block B) after the previous one. I don't need to clean up after the last iteration, because B is slow and the clean-up happens later in a more efficient way, such as at process exit. If the array is [1, 2, 3], the flow should be like:

    [1, :IN]
    [1, :OUT]
    [2, :IN]
    [2, :OUT]
    [3, :IN]
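To pin down the pattern, here it is as a plain indexed loop (a C++ sketch, just to fix the semantics; the question is how to express this tersely in Ruby):

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> xs = {1, 2, 3};
        for (std::size_t i = 0; i < xs.size(); ++i) {
            if (i != 0) std::printf("[%d, :OUT]\n", xs[i - 1]); // block B: clean up previous
            std::printf("[%d, :IN]\n", xs[i]);                  // block A: process current
        }
        // Deliberately no clean-up (B) after the last element.
    }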

It is similar to Mathematica's Riffle[] function. I already implemented it in Ruby three years ago, but that implementation is complex, and it has a place only for B, not for A. So I rewrote it for my current case:

    riffle = lambda do |enumerable, &block|
      prev = nil
      enumerable.each_with_index do |e, i|
        block.call prev unless i.zero?
        prev = e
      end
    end

    riffle.call(
      Enumerator.new do |e|
        [1, 2, 3].each do |i|
          e << i
          # start block A
          p [i, :IN]
          # end
        end
      end
    ) do |i|
      # start block B
      p [i, :OUT]
      # end
    end

Is there a way to write less code, and maybe avoid Enumerator.new? I tried to use .to_enum and .lazy.map(&A) but had no success.

Decide if a string is in a language without simulating the automaton accepting the language

Is it possible for a Turing machine, given as input a DFA that accepts a finite language together with a string, to decide whether the string is in the language?

More formally, is there a TM $M$ that can decide the following language without fully simulating the DFA on any string:

$L = \{\, \langle D, w \rangle \mid D \text{ is a DFA that accepts a finite language and } D \text{ accepts } w \,\}$?
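(Here, by "simulating" I mean the straightforward membership check, along these lines; the transition-table encoding is only an illustration:)

    #include <array>
    #include <string>
    #include <vector>

    // Illustrative DFA encoding over {0,1}: states 0..n-1, state 0 initial,
    // trans[s][c] is the successor of s on bit c, accept[s] marks final states.
    struct DFA {
        std::vector<std::array<int, 2>> trans;
        std::vector<bool> accept;
    };

    // The step-by-step simulation the question asks to avoid.
    bool simulate(const DFA& d, const std::string& w) {
        int s = 0;
        for (char c : w) s = d.trans[s][c - '0'];
        return d.accept[s];
    }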

I was thinking it is not possible, though I am not sure, because maybe one can deduce what the language is from the encoding of the DFA without simulating it.

I can't see how this can be done, but I am also not sure how to prove that it can't be done.

Pentax DSLR stopped accepting manual lenses

I've upgraded my K-S1 camera's firmware to the latest version, played with some settings, and reset them to defaults.
For some reason, manual (K-mount) lenses aren't accepted anymore. Modern autofocus lenses work fine, but with my old lenses the camera doesn't show the focal length setting when turned on. The display shows a flashing F-- and I can't take pictures in any mode.