Can you define a ‘discrete’ language?

Are the following appropriate definitions of formal languages over the alphabet {0,1}?

Example 1: A string w is a member of L under the following rules (a rough decider sketch follows the list):

  1. If more than half of its digits are 1’s, then w must be a member of a decidable language A (for w to be in L).

  2. If more than half of its digits are 0’s, then w must be a member of a decidable language B (for w to be in L).

  3. If exactly half of its digits are 1’s and half are 0’s, then w is not a member of the language.
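To make this concrete, here is a rough Python sketch of the decider I have in mind for Example 1. The names `in_A` and `in_B` are just stand-ins for the deciders of A and B (any total, always-halting predicates would do):

```python
def in_A(w: str) -> bool:
    # Stand-in decider for the decidable language A (placeholder only).
    return w.endswith("1")

def in_B(w: str) -> bool:
    # Stand-in decider for the decidable language B (placeholder only).
    return w.startswith("0")

def in_L_example1(w: str) -> bool:
    ones = w.count("1")
    zeros = w.count("0")
    if ones > zeros:      # more than half of the digits are 1's
        return in_A(w)
    if zeros > ones:      # more than half of the digits are 0's
        return in_B(w)
    return False          # exactly half 1's and half 0's: reject
```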

Example 2: w is a member of L under the following rules (again, a rough sketch follows the list):

  1. If w is longer than 10 bits, it must not be a member of decidable language A (whose complement is also decidable) in order to be a member of L.

  2. If w is 10 bits or shorter, it must be a member of decidable language B in order to be a member of L.
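As a sketch, the decider for Example 2 would simply branch on the length of w (again, `in_A` and `in_B` are placeholder deciders, not real definitions):

```python
def in_A(w: str) -> bool:
    # Stand-in decider for A, same placeholder idea as in the sketch above.
    return w.endswith("1")

def in_B(w: str) -> bool:
    # Stand-in decider for B, same placeholder idea as in the sketch above.
    return w.startswith("0")

def in_L_example2(w: str) -> bool:
    if len(w) > 10:
        return not in_A(w)   # longer than 10 bits: w must NOT be in A
    return in_B(w)           # 10 bits or shorter: w must be in B
```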

The general question: is this ‘discrete’ style of language definition acceptable?

In the same way that a function can be continuous or discontinuous, I am nicknaming this a ‘discrete’ (piecewise) definition of a language, because the rule (reason) that decides membership or non-membership can differ depending on which case the input falls into. I would assume this is OK? There is an argument that discontinuous functions are not computable, but I don’t think that argument holds when every input has finite precision, as is the case with finite binary strings.