Navigation Font Size Help

Can someone please help? I need to figure out some way to increase the font size of the navigation links at the bottom of my page. The guy who runs the site for me has been in a bad accident, and I'm trying to work this out myself but having a hard time. I just need to increase the size of the font on the links at the bottom of the page.

The page is www.jacksonvilletilepro.com
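Without seeing the stylesheet, the usual fix is a CSS rule targeting the bottom navigation links. The `#footer a` selector below is a placeholder (an assumption, since the page's actual markup is unknown); right-click one of the links, choose "Inspect" in the browser's developer tools to find the real id or class, then add a rule like this to the site's stylesheet:

```css
/* Hypothetical selector: replace #footer with whatever id or
   class actually wraps the bottom navigation links. */
#footer a {
    font-size: 18px; /* adjust to taste */
}
```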

Step size is effectively zero; singularity or stiff system suspected

I am trying to solve four coupled nonlinear differential equations:

\[Eta] = 0.125;
rg1s = Derivative[1][u1][t] - u1[t]*u3[t] - 3*u1[t]*u4[t] + u2[t]*u3[t] + 3*u2[t]*u4[t] + 2*(1 + 2*\[Eta])*u1[t]^2 == 0;
rg2s = Derivative[1][u2][t] - u2[t]*u3[t] - 3*u2[t]*u4[t] + u1[t]*u3[t] + 3*u1[t]*u4[t] + 2*(1 + 2*\[Eta])*u2[t]^2 == 0;
rg3t = Derivative[1][u3][t] - 0.5*((u1[t] - u2[t])^2 + u3[t]^2 - 3*u4[t]^2 + 6*u3[t]*u4[t]) == 0;
rg4t = Derivative[1][u4][t] - 0.5*((u1[t] - u2[t])^2 + u3[t]^2 + 5*u4[t]^2 - 2*u3[t]*u4[t]) == 0;
sol = NDSolve[{rg1s, rg2s, rg3t, rg4t, u1[0] == -0.6*0.2, u2[0] == -0.6*0.2, u3[0] == 0.6*0.2, u4[0] == 0.2}, {u1, u2, u3, u4}, {t, 1, 10}]

But every time I get the error NDSolve::ndsz: At t == 2.0833315360868916`, step size is effectively zero; singularity or stiff system suspected. I have tried all the approaches from similar questions available on this site. I want the values of u1, u2, u3 and u4 for large t, i.e. from 1 to 100, but I am only getting them for small t. Could you please suggest some way to sort out the issue?
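A cross-check in Python (my own rough SciPy port, not from the original post) suggests the blow-up is genuine rather than a solver artifact: since u1(0) = u2(0) and the first two equations are symmetric, u1 and u2 stay equal and their equation reduces to u1' = -2(1 + 2η)u1², which diverges in finite time, so no solver settings will carry this solution to t = 100. The sketch below stops integration at a blow-up event:

```python
import numpy as np
from scipy.integrate import solve_ivp

# SciPy port of the Mathematica system, with each u_i' isolated.
eta = 0.125

def rhs(t, u):
    u1, u2, u3, u4 = u
    du1 = u1*u3 + 3*u1*u4 - u2*u3 - 3*u2*u4 - 2*(1 + 2*eta)*u1**2
    du2 = u2*u3 + 3*u2*u4 - u1*u3 - 3*u1*u4 - 2*(1 + 2*eta)*u2**2
    du3 = 0.5*((u1 - u2)**2 + u3**2 - 3*u4**2 + 6*u3*u4)
    du4 = 0.5*((u1 - u2)**2 + u3**2 + 5*u4**2 - 2*u3*u4)
    return [du1, du2, du3, du4]

def blowup(t, u):
    # Stop once any component exceeds 1e3 in magnitude.
    return 1e3 - np.max(np.abs(u))
blowup.terminal = True

u0 = [-0.6*0.2, -0.6*0.2, 0.6*0.2, 0.2]
sol = solve_ivp(rhs, (0, 10), u0, method='Radau',
                events=blowup, rtol=1e-8, atol=1e-10)
# The terminal event should fire well before t = 10, consistent
# with NDSolve's reported singularity near t ~ 2.08.
```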

Thank you

When does NDSolve issue the zero-step-size error NDSolve::ndsz?

I want to detect directly when the step size becomes "effectively" zero. The example below, from the documentation, throws the error message as expected:

s = {};
NDSolve[{(2 - f[x]) f'[x] == f[x], f[0] == 1}, f, {x, 0, 5}, StepMonitor :> AppendTo[s, x]];

NDSolve::ndsz: At x == 0.3862940268757776`, step size is effectively zero; singularity or stiff system suspected.

The code below indicates that none of the actual steps taken has zero length.

AnyTrue[Differences@s, PossibleZeroQ]  (* False *) 

How does NDSolve decide that the step size is zero? I can of course capture the NDSolveValue::ndsz error, but I want to know exactly when (depending on which parameters) the error is issued. In some extreme cases, NDSolve may generate an InterpolatingFunction solution whose domain has practically zero length (but not according to PossibleZeroQ).
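For comparison (my own sketch, not from the question), SciPy's solver fails the same way on this example: it stops when the next required step would fall below the floating-point spacing at the current x, even though every step actually taken has strictly positive length; this mirrors the AnyTrue result above.

```python
import numpy as np
from scipy.integrate import solve_ivp

# The documentation example with f' isolated: (2 - f) f' = f becomes
# f' = f / (2 - f), which diverges as f(x) approaches 2
# (at x = 2 ln 2 - 1, matching the 0.38629... in the message).
sol = solve_ivp(lambda x, f: f / (2 - f), (0, 5), [1.0], rtol=1e-10)

print(sol.status)            # -1: integration failed before x = 5
print(sol.message)           # required step below float spacing
print(np.diff(sol.t).min())  # tiny, but strictly positive
```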

Does the Monstrous Combat Options rule of breath weapons scaling with creature size apply to PCs?

While discussing build options with a fellow party member (an Alchemist), they mentioned a rule granting breath weapons increased area when the user increases size category. After a bit of searching, I was able to find the rule in question:

Breath Weapon: The monster gains a breath weapon that deals 1d6 points of damage + 1d6 per CR. A target can attempt a Reflex saving throw to take half damage. If the breath weapon is a cone, it’s 30 feet long, increasing by 10 feet for each size category above Medium, and decreasing by 5 feet for every size category below Medium. If the breath weapon is a line, its area of effect is twice as long as a cone would be.

However, said rule was located in a section about monster creation and advancement, and now we're debating whether it applies to PCs changing size due to spells or buffs. Is there any other RAW indication of whether gaining or losing a size category would generally affect breath weapon areas in this manner, or is this ultimately a matter of GM fiat?

Identified Folder does not reduce in size over time

Hi @Sven

I am trying to figure out how to work through my Identified folder so that everything in it becomes verified (i.e., how to find more verified links faster).

Monitoring over the last 2 days, with a project that only uses Identified links, these are my folder sizes:

Day 0
Identified 494 MB
Submitted 549 MB
Verified 829 MB

Day 2
Identified 511 MB
Submitted 572 MB
Verified 857 MB

My expectation was that the Identified folder would reduce in size, since links are deleted from this folder when they are submitted/verified; however, the Identified folder is increasing in size instead.

Am I missing something?

What does a kernel of size n, n^2, … mean?

So according to Wikipedia,

In the notation of Flum and Grohe (2006), a "parameterized problem" consists of a decision problem $L \subseteq \Sigma^*$ and a function $\kappa : \Sigma^* \to \mathbb{N}$, the parameterization. The "parameter" of an instance $x$ is the number $\kappa(x)$. A "kernelization" for a parameterized problem $L$ is an algorithm that takes an instance $x$ with parameter $k$ and maps it in polynomial time to an instance $y$ such that

  • $x$ is in $L$ if and only if $y$ is in $L$, and
  • the size of $y$ is bounded by a computable function $f$ in $k$. Note that in this notation, the bound on the size of $y$ implies that the parameter of $y$ is also bounded by a function in $k$.

The function $f$ is often referred to as the size of the kernel. If $f = k^{O(1)}$, it is said that $L$ admits a polynomial kernel. Similarly, for $f = O(k)$, the problem admits a linear kernel.

Stupid question, but since the parameter can be anything, can't you just define the parameter to be really large, so that you always have a linear kernel?
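To make the definition concrete, here is a sketch (my own illustration in Python, not from Flum and Grohe) of the classic Buss kernelization for Vertex Cover, which produces a kernel with at most $k^2$ edges, i.e. a polynomial kernel:

```python
def buss_kernel(edges, k):
    """Buss's kernelization for Vertex Cover (illustrative sketch).

    Reduction rule: a vertex of degree > k must be in every cover of
    size <= k, so take it into the cover and delete its edges.  Once
    the rule no longer applies, a yes-instance has at most k^2 edges
    (each of <= k cover vertices covers <= k edges).
    """
    edges = {frozenset(e) for e in edges}
    forced = set()                      # vertices forced into the cover
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:                   # v must be in the cover
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None                     # definitely a no-instance
    return edges, k, forced             # kernel: at most k^2 edges
```

For example, on the star $K_{1,5}$ with $k = 2$, the center has degree 5 > 2, so it is forced into the cover and the kernel is empty.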

Can a warlock with Repelling Blast use Eldritch Blast to push a creature of any size 10 feet?

From the RAW, unless I am missing official errata or clarification documents from WotC, I do not see any size limitation on the use of the Repelling Blast power of Eldritch Blast.

So does that mean you could push a creature of any size back up to 10 feet, regardless of any context, weight, mass, or your own size?

How could a key be inserted in a heap without increasing the size of an array?

MAX-HEAP-INSERT(A, key)
    A.heap-size = A.heap-size + 1
    A[A.heap-size] = -infinity
    HEAP-INCREASE-KEY(A, A.heap-size, key)

How can a key be inserted into a heap without increasing the size of the array? With this code from Introduction to Algorithms, you can't just increase the heap size at will. Did I miss something? None of the online lectures I have seen talk about this issue, and neither does the book. Or is it that the lowest key in the array would be dropped automatically?
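For contrast, here is a sketch in Python (my own, not from the book) that makes CLRS's hidden assumption explicit: the pseudocode assumes the backing array still has spare capacity (A.heap-size < A.length); a practical implementation grows the array, e.g. by doubling, when it is full, while the book simply treats a full array as an error:

```python
import math

class MaxHeap:
    """Array-backed max-heap in the CLRS style, adding the detail the
    book leaves implicit: the backing array is doubled whenever
    A.heap-size would exceed A.length."""

    def __init__(self, capacity=4):
        self.a = [None] * capacity   # backing array A
        self.size = 0                # A.heap-size

    def increase_key(self, i, key):  # HEAP-INCREASE-KEY, 0-based
        assert key >= self.a[i]
        self.a[i] = key
        while i > 0 and self.a[(i - 1) // 2] < self.a[i]:
            p = (i - 1) // 2
            self.a[i], self.a[p] = self.a[p], self.a[i]
            i = p

    def insert(self, key):           # MAX-HEAP-INSERT
        if self.size == len(self.a):                     # array full:
            self.a.extend([None] * max(1, len(self.a)))  # double it
        self.size += 1
        self.a[self.size - 1] = -math.inf
        self.increase_key(self.size - 1, key)
```

Doubling keeps insertion amortized O(log n): the occasional O(n) copy is paid for by the n/2 cheap insertions that preceded it.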

Is it possible for the runtime and input size in an algorithm to be inversely related?

I'm wondering whether there can be algorithms whose runtime is monotonically decreasing in the input size, just as a fun mental exercise. If not, is it possible to disprove the claim? I haven't been able to come up with an example or a counterexample so far, and this sounds like an interesting problem.

P.S. Something like $O(\frac{1}{n})$, I guess (if it exists).
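One construction that comes up in discussions of this question (an illustration of mine, under the assumption that we only count work done after the size n is known, and ignoring the cost of reading the input):

```python
def shrinking_work(n, budget=10**6):
    """Performs floor(budget / n) units of "work", so the measured
    step count is monotonically decreasing in n.  The catch: this
    only makes sense if we charge nothing for receiving n itself;
    any algorithm that reads an n-sized input is already Omega(n),
    which is the usual argument against true O(1/n) runtimes."""
    steps = 0
    for _ in range(budget // n):
        steps += 1
    return steps
```

Here `shrinking_work(10)` does 100,000 steps while `shrinking_work(1000)` does only 1,000, and for n above the budget it does none at all.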