Feature/main menu (Sourcery refactored) #2
sourcery-ai[bot] wants to merge 1 commit into `feature/main_menu` from
Conversation
```diff
-if (pdb_line.split())[0] == "ATOM" or (pdb_line.split())[0] == "HETATM":
-    pdb_line_terms["RECORD TYPE"] = pdb_line[0:6].replace(" ", "")
+if (pdb_line.split())[0] in ["ATOM", "HETATM"]:
+    pdb_line_terms["RECORD TYPE"] = pdb_line[:6].replace(" ", "")
```
Function `get_pdb_line_components` refactored with the following changes:
- Replace multiple comparisons of same variable with `in` operator (`merge-comparisons`)
- Replace `a[0:x]` with `a[:x]` and `a[x:len(a)]` with `a[x:]` [×3] (`remove-redundant-slice-index`)
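For readers unfamiliar with the two rules, a minimal standalone sketch (a hypothetical helper and PDB line, not the project's actual parser):

```python
def record_type(line: str) -> str:
    # merge-comparisons: one membership test replaces two chained == checks
    if line.split()[0] in ["ATOM", "HETATM"]:
        # remove-redundant-slice-index: a[:6] instead of a[0:6]
        return line[:6].replace(" ", "")
    return ""

print(record_type("ATOM      1  N   MET A   1"))  # ATOM
```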
```diff
-    pdb_line_components["RECORD TYPE"] == "ATOM"
-    or pdb_line_components["RECORD TYPE"] == "HETATM"
-):
+if pdb_line_components["RECORD TYPE"] in ["ATOM", "HETATM"]:
```
Function `assemble_pdb_line_components` refactored with the following changes:
- Replace multiple comparisons of same variable with `in` operator (`merge-comparisons`)
```diff
-for key in pdb_line_terms.keys()
+for key in pdb_line_terms
```
Function `get_missing_terms` refactored with the following changes:
- Remove unnecessary call to `keys()` (`remove-dict-keys`)
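The rule relies on the fact that iterating a dict yields its keys directly, so `.keys()` is redundant. A toy illustration (hypothetical `terms` dict, not the project's data):

```python
terms = {"RECORD TYPE": "ATOM", "X COORD": "", "Y COORD": "1.0"}

# remove-dict-keys: `for key in terms` is equivalent to `for key in terms.keys()`
missing = [key for key in terms if terms[key] == ""]
print(missing)  # ['X COORD']
```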
```diff
-complex_lines: list[str] = []
 with open(f"{root_dir}/{protein_pdb}_processed.gro", "r") as protein_f:
     protein_lines = protein_f.readlines()
 with open(
     f"{root_dir}/{ligand_id}_fix.acpype/{ligand_id}_fix_GMX.gro", "r"
 ) as ligand_f:
     ligand_lines = ligand_f.readlines()
-for line in protein_lines[2:-1]:
-    complex_lines.append(line)
-for line in ligand_lines[2:-1]:
-    complex_lines.append(line)
+complex_lines: list[str] = list(protein_lines[2:-1])
+complex_lines.extend(iter(ligand_lines[2:-1]))
```
Function `combine_prot_lig_gro` refactored with the following changes:
- Move assignment closer to its usage within a block (`move-assign-in-block`)
- Convert for loop into list comprehension (`list-comprehension`)
- Replace a for append loop with list extend (`for-append-to-extend`)
- Simplify generator expression (`simplify-generator`)
- Replace identity comprehension with call to collection constructor (`identity-comprehension`)
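The combined effect of these rules can be sketched without the file I/O (hypothetical `.gro`-shaped lists: title line, atom count, atom lines, box line):

```python
protein_lines = ["title", "3", "p1", "p2", "p3", "box"]
ligand_lines = ["title", "2", "l1", "l2", "box"]

# identity-comprehension / for-append-to-extend: build the list from the first
# slice directly, then extend with the second, instead of two append loops
complex_lines = list(protein_lines[2:-1])
complex_lines.extend(ligand_lines[2:-1])
print(complex_lines)  # ['p1', 'p2', 'p3', 'l1', 'l2']
```

Note that the `iter()` wrapper in Sourcery's output is itself redundant: `extend` accepts any iterable, so passing the slice directly (as above) is equivalent.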
Before (both parameter dicts were built up front, and every branch re-tested `self.sim_type in sim_types`):

```python
default_params_for_ions_and_em: Dict[str, Any] = {
    "integrator": self.integrator,
    "nsteps": self.nsteps,
    "emtol": self.emtol,
    "emstep": self.emstep,
    "cutoff_scheme": self.cutoff_scheme,
    "nstlist": self.nstlist,
    "pbc": self.pbc,
    "ns_type": self.ns_type,
    "rlist": self.rlist,
    "coulombtype": self.coulombtype,
    "rcoulomb": self.rcoulomb,
    "rvdw": self.rvdw,
}

default_params_for_nvt_npt_md: Dict[str, Any] = {
    "integrator": self.integrator,
    "dt": self.dt,
    "nsteps": self.nsteps,
    "nstlog": self.nstlog,
    "nstenergy": self.nstenergy,
    "nstxout_compressed": self.nstxout_compressed,
    "cutoff_scheme": self.cutoff_scheme,
    "nstlist": self.nstlist,
    "pbc": self.pbc,
    "ns_type": self.ns_type,
    "rlist": self.rlist,
    "coulombtype": self.coulombtype,
    "rcoulomb": self.rcoulomb,
    "vdwtype": self.vdwtype,
    "vdw_modifier": self.vdw_modifier,
    "rvdw_switch": self.rvdw_switch,
    "rvdw": self.rvdw,
    "DispCorr": self.DispCorr,
    "fourierspacing": self.fourierspacing,
    "pme_order": self.pme_order,
    "tcoupl": self.tcoupl,
    "tc_grps": self.tc_grps,
    "tau_t": self.tau_t,
    "ref_t": self.ref_t,
    "pcoupl": self.pcoupl,
    "gen_vel": self.gen_vel,
    "constraints": self.constraints,
    "constraint_algorithm": self.constraint_algorithm,
    "continuation": self.continuation,
    "lincs_order": self.lincs_order,
    "lincs_iter": self.lincs_iter,
}

if self.sim_type in sim_types and self.sim_type == "ions":
    return default_params_for_ions_and_em
elif self.sim_type in sim_types and self.sim_type == "em":
    additional_params: Dict[str, Any] = {
        # ...
    }
    return default_params_for_ions_and_em | additional_params  # type: ignore
elif self.sim_type in sim_types and self.sim_type == "nvt":
    additional_params: Dict[str, Any] = {
        "define": self.define,
        "gen_temp": self.gen_temp,
        "gen_seed": self.gen_seed,
    }
    return default_params_for_nvt_npt_md | additional_params  # type: ignore
elif self.sim_type in sim_types and self.sim_type == "npt":
    additional_params: Dict[str, Any] = {
        "define": self.define,
        "pcoupltype": self.pcoupltype,
        "tau_p": self.tau_p,
        "compressibility": self.compressibility,
        "ref_p": self.ref_p,
        "refcoord_scaling": self.refcoord_scaling,
    }
    return default_params_for_nvt_npt_md | additional_params  # type: ignore
elif self.sim_type in sim_types and self.sim_type == "md":
    additional_params: Dict[str, Any] = {
        "pcoupltype": self.pcoupltype,
        "tau_p": self.tau_p,
        "compressibility": self.compressibility,
        "ref_p": self.ref_p,
        # "freezegrps": self.freezegrps,
        # "freezedim": self.freezedim,
    }
    return default_params_for_nvt_npt_md | additional_params  # type: ignore
```

After (the membership test is lifted into one `if`, both dicts move inside it, and the branches become a switch-like chain on `self.sim_type` alone):

```python
if self.sim_type in sim_types:
    default_params_for_ions_and_em: Dict[str, Any] = {
        # same key/value pairs as before
    }

    default_params_for_nvt_npt_md: Dict[str, Any] = {
        # same key/value pairs as before
    }

    if self.sim_type == "em":
        additional_params: Dict[str, Any] = {
            "vdwtype": self.vdwtype,
            "vdw_modifier": self.vdw_modifier,
            "rvdw_switch": self.rvdw_switch,
            "DispCorr": self.DispCorr,
        }
        return default_params_for_ions_and_em | additional_params  # type: ignore
    elif self.sim_type == "ions":
        return default_params_for_ions_and_em
    elif self.sim_type == "md":
        additional_params: Dict[str, Any] = {
            "pcoupltype": self.pcoupltype,
            "tau_p": self.tau_p,
            "compressibility": self.compressibility,
            "ref_p": self.ref_p,
            # "freezegrps": self.freezegrps,
            # "freezedim": self.freezedim,
        }
        return default_params_for_nvt_npt_md | additional_params  # type: ignore
    elif self.sim_type == "npt":
        additional_params: Dict[str, Any] = {
            "define": self.define,
            "pcoupltype": self.pcoupltype,
            "tau_p": self.tau_p,
            "compressibility": self.compressibility,
            "ref_p": self.ref_p,
            "refcoord_scaling": self.refcoord_scaling,
        }
        return default_params_for_nvt_npt_md | additional_params  # type: ignore
    elif self.sim_type == "nvt":
        additional_params: Dict[str, Any] = {
            "define": self.define,
            "gen_temp": self.gen_temp,
            "gen_seed": self.gen_seed,
        }
        return default_params_for_nvt_npt_md | additional_params  # type: ignore
```
Function `MDP.write_mdp_file` refactored with the following changes:
- Move assignments closer to their usage [×2] (`move-assign`)
- Lift repeated conditional into its own if statement (`lift-duplicated-conditional`)
- Simplify conditional into switch-like form [×3] (`switch`)
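The lifted-conditional shape is easier to see in a toy version (hypothetical function and keys, not the real MDP fields):

```python
from typing import Any, Dict

def params_for(sim_type: str) -> Dict[str, Any]:
    sim_types = {"ions", "em", "nvt"}
    if sim_type in sim_types:  # lift-duplicated-conditional: tested exactly once
        base: Dict[str, Any] = {"integrator": "steep", "nsteps": 50000}
        if sim_type == "em":   # switch-like chain on the same variable
            return base | {"emtol": 1000.0}  # PEP 584 dict merge (Python 3.9+)
        elif sim_type == "ions":
            return base
        elif sim_type == "nvt":
            return base | {"gen_seed": -1}
    raise ValueError(f"unsupported sim_type: {sim_type}")
```

The `|` operator returns a new dict with the right-hand operand's keys taking precedence, which is why the `additional_params` merges above override nothing but only add keys.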
```diff
-key_missing_numbers.append(int("0"))
-key_missing_numbers.append(int("9999"))
+key_missing_numbers.extend((int("0"), int("9999")))
```
Function `checkConsecutive` refactored with the following changes:
- Merge consecutive list appends into a single extend (`merge-list-appends-into-extend`)
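A standalone sketch of the rule (hypothetical starting list):

```python
key_missing_numbers = [5, 12]

# merge-list-appends-into-extend: one extend call instead of two appends
key_missing_numbers.extend((0, 9999))
print(key_missing_numbers)  # [5, 12, 0, 9999]
```

The `int("0")` and `int("9999")` casts that survive in Sourcery's output are still redundant; plain integer literals, as used here, are equivalent.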
```diff
-if first_line != "":
-    if first_line in lines_lst[line_idx]:
-        output_lst.append(lines_lst[line_idx])
-        try:
-            while len(lines_lst[line_idx]) != 0:
-                line_idx += 1
-                if len(lines_lst[line_idx]) != 0:
-                    output_lst.append(lines_lst[line_idx])
-                else:
-                    break
-        except IndexError:
-            pass
+if first_line != "" and first_line in lines_lst[line_idx]:
+    output_lst.append(lines_lst[line_idx])
+    try:
+        while len(lines_lst[line_idx]) != 0:
+            line_idx += 1
+            if len(lines_lst[line_idx]) != 0:
+                output_lst.append(lines_lst[line_idx])
+            else:
+                break
+    except IndexError:
+        pass
```
Function `get_lines` refactored with the following changes:
- Merge nested if conditions (`merge-nested-ifs`)
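The merge is safe here because `and` short-circuits: the membership test never runs when `first_line` is empty, exactly as in the nested form. A minimal sketch (hypothetical helper):

```python
def line_starts_section(first_line: str, current_line: str) -> bool:
    # merge-nested-ifs: `if a: if b:` becomes `if a and b:`;
    # short-circuiting preserves the original evaluation order
    return first_line != "" and first_line in current_line
```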
```diff
-if pdb_type == "COMPLEX" or pdb_type == "PROTEIN":
+if pdb_type in {"COMPLEX", "PROTEIN"}:
```
Function `read_topol` refactored with the following changes:
- Replace multiple comparisons of same variable with `in` operator [×2] (`merge-comparisons`)
- Replace a for append loop with list extend (`for-append-to-extend`)
- Simplify generator expression (`simplify-generator`)
- Use set when checking membership of a collection of literals [×2] (`collection-into-set`)
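The set-literal variant is preferred because CPython compiles a set of constants into a cached frozenset, giving constant-time membership checks. A sketch (hypothetical helper name):

```python
def involves_protein(pdb_type: str) -> bool:
    # collection-into-set: membership in a set literal of constants
    return pdb_type in {"COMPLEX", "PROTEIN"}
```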
```diff
-lig_itp_contents = lines[0:1] + lines[10:]
-
-return lig_itp_contents
+return lines[:1] + lines[10:]
```
Function `write_prm_file` refactored with the following changes:
- Inline variable that is immediately returned (`inline-immediately-returned-variable`)
- Replace `a[0:x]` with `a[:x]` and `a[x:len(a)]` with `a[x:]` (`remove-redundant-slice-index`)
```diff
-else:
-    count = 0
-    for item in resname_lst:
-        if item in amino_acids_3_letter_codes:
-            count += 1
-    if count == 0:
-        return "LIGAND"
-    else:
-        return "COMPLEX"
+count = sum(item in amino_acids_3_letter_codes for item in resname_lst)
+return "LIGAND" if count == 0 else "COMPLEX"
```
Function `check_pdb_type` refactored with the following changes:
- Remove unnecessary else after guard condition (`remove-unnecessary-else`)
- Convert for loop into call to sum() (`sum-comprehension`)
- Replace if statement with if expression (`assign-if-exp`)
- Remove unnecessary casts to int, str, float or bool (`remove-unnecessary-cast`)
- Simplify constant sum() call (`simplify-constant-sum`)
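The `sum()` form works because booleans are integers in Python, so `sum()` over a generator of membership tests counts the `True` values. A simplified stand-in (toy residue-code set, not the project's full table):

```python
amino_acids_3_letter_codes = {"ALA", "GLY", "MET", "SER"}

def check_pdb_type(resname_lst: list) -> str:
    # sum-comprehension: count matches without a manual counter loop
    count = sum(item in amino_acids_3_letter_codes for item in resname_lst)
    # assign-if-exp: one conditional expression instead of an if/else pair
    return "LIGAND" if count == 0 else "COMPLEX"
```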
Sourcery Code Quality Report

✅ Merging this PR will increase code quality in the affected files by 3.36%.
Here are some functions in these files that still need a tune-up:
Pull Request #1 refactored by Sourcery.
If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.
NOTE: As code is pushed to the original Pull Request, Sourcery will
re-run and update (force-push) this Pull Request with new refactorings as
necessary. If Sourcery finds no refactorings at any point, this Pull Request
will be closed automatically.
See our documentation here.
Run Sourcery locally
Reduce the feedback loop during development by using the Sourcery editor plugin:
Review changes via command line
To manually merge these changes, make sure you're on the `feature/main_menu` branch, then run: